[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
28023 1726853607.52064: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
28023 1726853607.52512: Added group all to inventory
28023 1726853607.52515: Added group ungrouped to inventory
28023 1726853607.52519: Group all now contains ungrouped
28023 1726853607.52522: Examining possible inventory source: /tmp/network-iHm/inventory.yml
28023 1726853607.67852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
28023 1726853607.67914: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
28023 1726853607.67936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
28023 1726853607.67997: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
28023 1726853607.68067: Loaded config def from plugin (inventory/script)
28023 1726853607.68069: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
28023 1726853607.68111: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
28023 1726853607.68198: Loaded config def from plugin (inventory/yaml)
28023 1726853607.68200: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
28023 1726853607.68286: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
28023 1726853607.68702: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
28023 1726853607.68705: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
28023 1726853607.68708: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
28023 1726853607.68714: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
28023 1726853607.68718: Loading data from /tmp/network-iHm/inventory.yml
28023 1726853607.68787: /tmp/network-iHm/inventory.yml was not parsable by auto
28023 1726853607.68849: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
28023 1726853607.68888: Loading data from /tmp/network-iHm/inventory.yml
28023 1726853607.68967: group all already in inventory
28023 1726853607.68976: set inventory_file for managed_node1
28023 1726853607.68980: set inventory_dir for managed_node1
28023 1726853607.68981: Added host managed_node1 to inventory
28023 1726853607.68983: Added host managed_node1 to group all
28023 1726853607.68984: set ansible_host for managed_node1
28023 1726853607.68985: set ansible_ssh_extra_args for managed_node1
28023 1726853607.68988: set inventory_file for managed_node2
28023 1726853607.68991: set inventory_dir for managed_node2
28023 1726853607.68992: Added host managed_node2 to inventory
28023 1726853607.68993: Added host managed_node2 to group all
28023 1726853607.68994: set ansible_host for managed_node2
28023 1726853607.68995: set ansible_ssh_extra_args for managed_node2
28023 1726853607.68997: set inventory_file for managed_node3
28023 1726853607.68999: set inventory_dir for managed_node3
28023 1726853607.69000: Added host managed_node3 to inventory
28023 1726853607.69001: Added host managed_node3 to group all
28023 1726853607.69002: set ansible_host for managed_node3
28023 1726853607.69003: set ansible_ssh_extra_args for managed_node3
28023 1726853607.69005: Reconcile groups and hosts in inventory.
28023 1726853607.69009: Group ungrouped now contains managed_node1
28023 1726853607.69011: Group ungrouped now contains managed_node2
28023 1726853607.69012: Group ungrouped now contains managed_node3
28023 1726853607.69086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
28023 1726853607.69208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
28023 1726853607.69254: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
28023 1726853607.69284: Loaded config def from plugin (vars/host_group_vars)
28023 1726853607.69286: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
28023 1726853607.69293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
28023 1726853607.69301: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
28023 1726853607.69342: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
28023 1726853607.69663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853607.69758: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
28023 1726853607.69800: Loaded config def from plugin (connection/local)
28023 1726853607.69803: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
28023 1726853607.70454: Loaded config def from plugin (connection/paramiko_ssh)
28023 1726853607.70458: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
28023 1726853607.71326: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
28023 1726853607.71365: Loaded config def from plugin (connection/psrp)
28023 1726853607.71368: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
28023 1726853607.72069: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
28023 1726853607.72112: Loaded config def from plugin (connection/ssh)
28023 1726853607.72115: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
28023 1726853607.73995: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
28023 1726853607.74034: Loaded config def from plugin (connection/winrm)
28023 1726853607.74037: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
28023 1726853607.74067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
28023 1726853607.74128: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
28023 1726853607.74192: Loaded config def from plugin (shell/cmd)
28023 1726853607.74195: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
28023 1726853607.74221: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
28023 1726853607.74288: Loaded config def from plugin (shell/powershell)
28023 1726853607.74291: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
28023 1726853607.74341: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
28023 1726853607.74521: Loaded config def from plugin (shell/sh)
28023 1726853607.74523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
28023 1726853607.74556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
28023 1726853607.74677: Loaded config def from plugin (become/runas)
28023 1726853607.74680: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
28023 1726853607.74858: Loaded config def from plugin (become/su)
28023 1726853607.74860: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
28023 1726853607.75022: Loaded config def from plugin (become/sudo)
28023 1726853607.75024: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
28023 1726853607.75056: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml
28023 1726853607.75381: in VariableManager get_vars()
28023 1726853607.75403: done with get_vars()
28023 1726853607.75528: trying /usr/local/lib/python3.12/site-packages/ansible/modules
28023 1726853607.78411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
28023 1726853607.78524: in VariableManager get_vars()
28023 1726853607.78530: done with get_vars()
28023 1726853607.78532: variable 'playbook_dir' from source: magic vars
28023 1726853607.78533: variable 'ansible_playbook_python' from source: magic vars
28023 1726853607.78534: variable 'ansible_config_file' from source: magic vars
28023 1726853607.78535: variable 'groups' from source: magic vars
28023 1726853607.78536: variable 'omit' from source: magic vars
28023 1726853607.78536: variable 'ansible_version' from source: magic vars
28023 1726853607.78537: variable 'ansible_check_mode' from source: magic vars
28023 1726853607.78538: variable 'ansible_diff_mode' from source: magic vars
28023 1726853607.78538: variable 'ansible_forks' from source: magic vars
28023 1726853607.78539: variable 'ansible_inventory_sources' from source: magic vars
28023 1726853607.78540: variable 'ansible_skip_tags' from source: magic vars
28023 1726853607.78540: variable 'ansible_limit' from source: magic vars
28023 1726853607.78541: variable 'ansible_run_tags' from source: magic vars
28023 1726853607.78542: variable 'ansible_verbosity' from source: magic vars
28023 1726853607.78579: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml
28023 1726853607.79390: in VariableManager get_vars()
28023 1726853607.79406: done with get_vars()
28023 1726853607.79443: in VariableManager get_vars()
28023 1726853607.79456: done with get_vars()
28023 1726853607.79494: in VariableManager get_vars()
28023 1726853607.79507: done with get_vars()
28023 1726853607.79609: in VariableManager get_vars()
28023 1726853607.79622: done with get_vars()
28023 1726853607.79656: in VariableManager get_vars()
28023 1726853607.79668: done with get_vars()
28023 1726853607.79718: in VariableManager get_vars()
28023 1726853607.79730: done with get_vars()
28023 1726853607.79787: in VariableManager get_vars()
28023 1726853607.79800: done with get_vars()
28023 1726853607.79804: variable 'omit' from source: magic vars
28023 1726853607.79822: variable 'omit' from source: magic vars
28023 1726853607.79855: in VariableManager get_vars()
28023 1726853607.79865: done with get_vars()
28023 1726853607.79910: in VariableManager get_vars()
28023 1726853607.79922: done with get_vars()
28023 1726853607.79956: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
28023 1726853607.80168: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
28023 1726853607.80300: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
28023 1726853607.80938: in VariableManager get_vars()
28023 1726853607.80957: done with get_vars()
28023 1726853607.81380: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
28023 1726853607.81514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
28023 1726853607.84364: in VariableManager get_vars()
28023 1726853607.84385: done with get_vars()
28023 1726853607.84390: variable 'omit' from source: magic vars
28023 1726853607.84402: variable 'omit' from source: magic vars
28023 1726853607.84434: in VariableManager get_vars()
28023 1726853607.84448: done with get_vars()
28023 1726853607.84473: in VariableManager get_vars()
28023 1726853607.84488: done with get_vars()
28023 1726853607.84517: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
28023 1726853607.84631: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
28023 1726853607.84710: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
28023 1726853607.86944: in VariableManager get_vars()
28023 1726853607.86969: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
28023 1726853607.89043: in VariableManager get_vars()
28023 1726853607.89066: done with get_vars()
28023 1726853607.89104: in VariableManager get_vars()
28023 1726853607.89123: done with get_vars()
28023 1726853607.89248: in VariableManager get_vars()
28023 1726853607.89269: done with get_vars()
28023 1726853607.89307: in VariableManager get_vars()
28023 1726853607.89324: done with get_vars()
28023 1726853607.89365: in VariableManager get_vars()
28023 1726853607.89385: done with get_vars()
28023 1726853607.89421: in VariableManager get_vars()
28023 1726853607.89440: done with get_vars()
28023 1726853607.89502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
28023 1726853607.89516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
28023 1726853607.89757: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
28023 1726853607.89927: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
28023 1726853607.89929: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
28023 1726853607.89961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
28023 1726853607.89987: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
28023 1726853607.90163: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
28023 1726853607.90223: Loaded config def from plugin (callback/default)
28023 1726853607.90226: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
28023 1726853607.91346: Loaded config def from plugin (callback/junit)
28023 1726853607.91348: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
28023 1726853607.91395: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
28023 1726853607.91463: Loaded config def from plugin (callback/minimal)
28023 1726853607.91465: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
28023 1726853607.91505: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
28023 1726853607.91567: Loaded config def from plugin (callback/tree)
28023 1726853607.91569: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
28023 1726853607.91699: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
28023 1726853607.91702: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
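The run above used no config file ("No config file found; using defaults"), so the deprecation warning at the top and the callback selection logged here both fell back to defaults and environment variables. A minimal ansible.cfg sketch of the knobs this section of the log touches (the values are illustrative assumptions, not taken from this run):

```ini
[defaults]
# Silence the deprecation warning quoted at the top of this log.
deprecation_warnings = False

# Singular setting; the plural ANSIBLE_COLLECTIONS_PATHS env var is what the
# warning says will be removed in ansible-core 2.19.
collections_path = /tmp/collections-Qi7

# Only one stdout callback is ever active; the others are skipped, as the
# "Skipping callback ..." lines show. Non-stdout callbacks such as
# profile_tasks must be enabled explicitly.
stdout_callback = ansible.posix.debug
callbacks_enabled = ansible.posix.profile_tasks
```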
PLAYBOOK: tests_route_device_nm.yml ********************************************
2 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml
28023 1726853607.91727: in VariableManager get_vars()
28023 1726853607.91738: done with get_vars()
28023 1726853607.91743: in VariableManager get_vars()
28023 1726853607.91751: done with get_vars()
28023 1726853607.91757: variable 'omit' from source: magic vars
28023 1726853607.91792: in VariableManager get_vars()
28023 1726853607.91805: done with get_vars()
28023 1726853607.91823: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_route_device.yml' with nm as provider] *****
28023 1726853607.92343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
28023 1726853607.92416: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
28023 1726853607.92447: getting the remaining hosts for this loop
28023 1726853607.92449: done getting the remaining hosts for this loop
28023 1726853607.92452: getting the next task for host managed_node3
28023 1726853607.92458: done getting next task for host managed_node3
28023 1726853607.92460: ^ task is: TASK: Gathering Facts
28023 1726853607.92461: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28023 1726853607.92464: getting variables
28023 1726853607.92465: in VariableManager get_vars()
28023 1726853607.92477: Calling all_inventory to load vars for managed_node3
28023 1726853607.92479: Calling groups_inventory to load vars for managed_node3
28023 1726853607.92482: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853607.92494: Calling all_plugins_play to load vars for managed_node3
28023 1726853607.92505: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853607.92508: Calling groups_plugins_play to load vars for managed_node3
28023 1726853607.92539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853607.92597: done with get_vars()
28023 1726853607.92603: done getting variables
28023 1726853607.92668: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
Friday 20 September 2024 13:33:27 -0400 (0:00:00.010) 0:00:00.010 ******
28023 1726853607.92691: entering _queue_task() for managed_node3/gather_facts
28023 1726853607.92693: Creating lock for gather_facts
28023 1726853607.93069: worker is 1 (out of 1 available)
28023 1726853607.93081: exiting _queue_task() for managed_node3/gather_facts
28023 1726853607.93093: done queuing things up, now waiting for results queue to drain
28023 1726853607.93094: waiting for pending results...
28023 1726853607.93691: running TaskExecutor() for managed_node3/TASK: Gathering Facts
28023 1726853607.93696: in run() - task 02083763-bbaf-fdb6-dad7-0000000000bf
28023 1726853607.93716: variable 'ansible_search_path' from source: unknown
28023 1726853607.93816: calling self._execute()
28023 1726853607.93886: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853607.94077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853607.94081: variable 'omit' from source: magic vars
28023 1726853607.94176: variable 'omit' from source: magic vars
28023 1726853607.94238: variable 'omit' from source: magic vars
28023 1726853607.94278: variable 'omit' from source: magic vars
28023 1726853607.94336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28023 1726853607.94382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28023 1726853607.94411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28023 1726853607.94443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28023 1726853607.94462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28023 1726853607.94499: variable 'inventory_hostname' from source: host vars for 'managed_node3'
28023 1726853607.94508: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853607.94549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853607.94626: Set connection var ansible_shell_type to sh
28023 1726853607.94640: Set connection var ansible_shell_executable to /bin/sh
28023 1726853607.94656: Set connection var ansible_module_compression to ZIP_DEFLATED
28023 1726853607.94667: Set connection var ansible_connection to ssh
28023 1726853607.94678: Set connection var ansible_pipelining to False
28023 1726853607.94767: Set connection var ansible_timeout to 10
28023 1726853607.94772: variable 'ansible_shell_executable' from source: unknown
28023 1726853607.94775: variable 'ansible_connection' from source: unknown
28023 1726853607.94777: variable 'ansible_module_compression' from source: unknown
28023 1726853607.94779: variable 'ansible_shell_type' from source: unknown
28023 1726853607.94782: variable 'ansible_shell_executable' from source: unknown
28023 1726853607.94784: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853607.94786: variable 'ansible_pipelining' from source: unknown
28023 1726853607.94788: variable 'ansible_timeout' from source: unknown
28023 1726853607.94791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853607.94955: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
28023 1726853607.94977: variable 'omit' from source: magic vars
28023 1726853607.94991: starting attempt loop
28023 1726853607.94998: running the handler
28023 1726853607.95016: variable 'ansible_facts' from source: unknown
28023 1726853607.95043: _low_level_execute_command(): starting
28023 1726853607.95058: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
28023 1726853607.95851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
28023 1726853607.95867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
28023 1726853607.95974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
28023 1726853607.95978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28023 1726853607.96016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
28023 1726853607.96039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28023 1726853607.96086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28023 1726853607.96152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28023 1726853607.97868: stdout chunk (state=3): >>>/root <<<
28023 1726853607.98003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28023 1726853607.98166: stdout chunk (state=3): >>><<<
28023 1726853607.98169: stderr chunk (state=3): >>><<<
28023 1726853607.98173: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28023 1726853607.98175: _low_level_execute_command(): starting
28023 1726853607.98178: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772 `" && echo ansible-tmp-1726853607.9809022-28042-235173589384772="` echo /root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772 `" ) && sleep 0'
28023 1726853607.99268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
28023 1726853607.99377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
28023 1726853607.99393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
28023 1726853607.99410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28023 1726853607.99484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28023 1726853607.99602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
28023 1726853607.99620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28023 1726853607.99723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28023 1726853607.99855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28023 1726853608.02062: stdout chunk (state=3): >>>ansible-tmp-1726853607.9809022-28042-235173589384772=/root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772 <<<
28023 1726853608.02149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28023 1726853608.02152: stdout chunk (state=3): >>><<<
28023 1726853608.02157: stderr chunk (state=3): >>><<<
28023 1726853608.02377: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853607.9809022-28042-235173589384772=/root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28023 1726853608.02381: variable 'ansible_module_compression' from source: unknown
28023 1726853608.02384: ANSIBALLZ: Using generic lock for ansible.legacy.setup
28023 1726853608.02386: ANSIBALLZ: Acquiring lock
28023 1726853608.02388: ANSIBALLZ: Lock acquired: 139729396667488
28023 1726853608.02390: ANSIBALLZ: Creating module
28023 1726853608.41268: ANSIBALLZ: Writing module into payload
28023 1726853608.41435: ANSIBALLZ: Writing module
28023 1726853608.41457: ANSIBALLZ: Renaming module
28023 1726853608.41461: ANSIBALLZ: Done creating module
28023 1726853608.41485: variable 'ansible_facts' from source: unknown
28023 1726853608.41492: variable 'inventory_hostname' from source: host vars for 'managed_node3'
28023 1726853608.41501: _low_level_execute_command(): starting
28023 1726853608.41508: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
28023 1726853608.42165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
28023 1726853608.42289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
28023 1726853608.42307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28023 1726853608.42332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28023 1726853608.42418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28023 1726853608.44212: stdout chunk (state=3): >>>PLATFORM Linux FOUND <<<
28023 1726853608.44217: stdout chunk (state=3): >>>/usr/bin/python3.12 /usr/bin/python3 <<<
28023 1726853608.44220: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<<
28023 1726853608.44626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28023 1726853608.44630: stdout chunk (state=3): >>><<<
28023 1726853608.44633: stderr chunk (state=3): >>><<<
28023 1726853608.44651: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853608.44662 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 28023 1726853608.44717: _low_level_execute_command(): starting 28023 1726853608.44720: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 28023 1726853608.45052: Sending initial data 28023 1726853608.45058: Sent initial data (1181 bytes) 28023 1726853608.45594: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853608.45608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853608.45616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853608.45630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853608.45636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853608.45657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853608.45661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853608.45682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853608.45687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853608.45767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853608.45815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853608.45866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853608.49457: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 28023 1726853608.50085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853608.50089: stdout chunk (state=3): >>><<< 28023 1726853608.50092: stderr chunk (state=3): >>><<< 28023 
1726853608.50094: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853608.50098: variable 'ansible_facts' from source: unknown 28023 1726853608.50100: variable 'ansible_facts' from source: unknown 28023 1726853608.50102: variable 
'ansible_module_compression' from source: unknown 28023 1726853608.50108: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28023 1726853608.50143: variable 'ansible_facts' from source: unknown 28023 1726853608.50337: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772/AnsiballZ_setup.py 28023 1726853608.50491: Sending initial data 28023 1726853608.50524: Sent initial data (154 bytes) 28023 1726853608.51203: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853608.51291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853608.51331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853608.51349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853608.51388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853608.51521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853608.53187: 
stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28023 1726853608.53193: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 28023 1726853608.53201: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 28023 1726853608.53209: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 28023 1726853608.53216: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 28023 1726853608.53231: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853608.53308: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853608.53397: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp62pglyoi /root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772/AnsiballZ_setup.py <<< 28023 1726853608.53401: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772/AnsiballZ_setup.py" <<< 28023 1726853608.53458: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp62pglyoi" to remote "/root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772/AnsiballZ_setup.py" <<< 28023 1726853608.55594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853608.55598: stdout chunk (state=3): >>><<< 28023 1726853608.55601: stderr chunk (state=3): >>><<< 28023 1726853608.55603: done transferring module to remote 28023 1726853608.55658: _low_level_execute_command(): starting 28023 1726853608.55662: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772/ /root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772/AnsiballZ_setup.py && sleep 0' 28023 1726853608.56269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853608.56281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853608.56476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853608.56480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853608.56482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 
1726853608.56489: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853608.56492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853608.56494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853608.56496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853608.56561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853608.58790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853608.58795: stdout chunk (state=3): >>><<< 28023 1726853608.58797: stderr chunk (state=3): >>><<< 28023 1726853608.58800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853608.58802: _low_level_execute_command(): starting 28023 1726853608.58804: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772/AnsiballZ_setup.py && sleep 0' 28023 1726853608.59795: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853608.59867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853608.59873: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853608.60001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853608.63006: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 28023 1726853608.63062: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 28023 1726853608.63112: stdout chunk (state=3): >>>import 'posix' # <<< 28023 1726853608.63147: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 28023 1726853608.63507: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ce04d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845cafb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ce2a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 28023 1726853608.63623: stdout chunk (state=3): >>>import '_collections_abc' # <<< 28023 1726853608.63655: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 28023 
1726853608.63695: stdout chunk (state=3): >>>import 'os' # <<< 28023 1726853608.63736: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 28023 1726853608.63739: stdout chunk (state=3): >>>Processing user site-packages <<< 28023 1726853608.63742: stdout chunk (state=3): >>>Processing global site-packages <<< 28023 1726853608.63765: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 28023 1726853608.63770: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 28023 1726853608.63808: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 28023 1726853608.63847: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845a91130> <<< 28023 1726853608.63911: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 28023 1726853608.63924: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853608.63940: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845a91fa0> <<< 28023 1726853608.63973: stdout chunk (state=3): >>>import 'site' # <<< 28023 1726853608.64010: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28023 1726853608.64651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 28023 1726853608.64669: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 28023 1726853608.64709: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 28023 1726853608.64762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 28023 1726853608.64817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 28023 1726853608.64835: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845acfdd0> <<< 28023 1726853608.64861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 28023 1726853608.64880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 28023 1726853608.64936: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845acffe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 28023 1726853608.64963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 28023 1726853608.64997: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py <<< 28023 1726853608.65068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853608.65110: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b07800> <<< 28023 1726853608.65158: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b07e90> <<< 28023 1726853608.65180: stdout chunk (state=3): >>>import '_collections' # <<< 28023 1726853608.65249: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ae7aa0> import '_functools' # <<< 28023 1726853608.65298: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ae51c0> <<< 28023 1726853608.65452: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845accf80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 28023 1726853608.65469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 28023 1726853608.65493: stdout chunk (state=3): >>>import '_sre' # <<< 28023 1726853608.65542: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 28023 1726853608.65576: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 28023 1726853608.65588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 28023 1726853608.65617: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b276e0> <<< 28023 1726853608.65679: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b26300> <<< 28023 1726853608.65716: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ae6060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845acee70> <<< 28023 1726853608.65762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5c7a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845acc200> <<< 28023 1726853608.65821: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845b5cc50> import 
'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5cb00> <<< 28023 1726853608.65873: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.65900: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845b5cef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845acad20> <<< 28023 1726853608.65928: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853608.65991: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 28023 1726853608.66016: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5d5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5d280> <<< 28023 1726853608.66039: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 28023 1726853608.66085: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 28023 1726853608.66100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5e4b0> import 'importlib.util' # <<< 28023 1726853608.66130: stdout chunk (state=3): >>>import 'runpy' # <<< 28023 1726853608.66179: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 28023 1726853608.66211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b74680> <<< 28023 1726853608.66240: stdout chunk (state=3): >>>import 'errno' # <<< 28023 1726853608.66275: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.66301: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845b75d30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 28023 1726853608.66314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 28023 1726853608.66347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 28023 1726853608.66404: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b76bd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845b77230> <<< 28023 1726853608.66430: stdout chunk 
(state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b76120> <<< 28023 1726853608.66450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 28023 1726853608.66479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 28023 1726853608.66495: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.66522: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845b77cb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b773e0> <<< 28023 1726853608.66621: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5e450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 28023 1726853608.66656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 28023 1726853608.66691: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 28023 1726853608.66733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 28023 1726853608.66791: stdout chunk (state=3): >>> # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 28023 1726853608.66809: stdout chunk (state=3): >>> # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845867b90> <<< 28023 
1726853608.66847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py<<< 28023 1726853608.66906: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 28023 1726853608.66910: stdout chunk (state=3): >>> # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 28023 1726853608.66961: stdout chunk (state=3): >>> import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845890650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8458903b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.66964: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 28023 1726853608.66967: stdout chunk (state=3): >>> <<< 28023 1726853608.67044: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845890680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 28023 1726853608.67137: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.67312: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845890fb0> <<< 28023 1726853608.67448: stdout chunk (state=3): >>># extension module 
'_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845891910> <<< 28023 1726853608.67469: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845890860> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845865d60> <<< 28023 1726853608.67497: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 28023 1726853608.67525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 28023 1726853608.67557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845892cc0> <<< 28023 1726853608.67593: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8458917f0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5eba0> <<< 28023 1726853608.67608: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 28023 1726853608.67696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853608.67706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 28023 1726853608.67742: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 28023 1726853608.67761: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8458bf020> <<< 28023 1726853608.67833: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 28023 1726853608.67837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853608.67873: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 28023 1726853608.67877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 28023 1726853608.67935: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8458e3410> <<< 28023 1726853608.67939: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 28023 1726853608.67995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 28023 1726853608.68036: stdout chunk (state=3): >>>import 'ntpath' # <<< 28023 1726853608.68068: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8459401a0> <<< 28023 1726853608.68084: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 28023 1726853608.68117: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 28023 1726853608.68146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 28023 1726853608.68178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 28023 1726853608.68264: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845942900> <<< 28023 1726853608.68354: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8459402c0> <<< 28023 1726853608.68404: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84590d1c0> <<< 28023 1726853608.68410: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 28023 1726853608.68434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457492e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8458e2210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845893bf0> <<< 28023 1726853608.68624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 28023 1726853608.68646: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff8458e2570> <<< 28023 1726853608.68983: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_b7u5d_20/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 28023 1726853608.69097: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 
1726853608.69144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 28023 1726853608.69197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 28023 1726853608.69403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 28023 1726853608.69418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457ab020> import '_typing' # <<< 28023 1726853608.69488: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845789f10> <<< 28023 1726853608.69522: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457890a0> # zipimport: zlib available import 'ansible' # <<< 28023 1726853608.69547: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.69573: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 28023 1726853608.69594: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.71014: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.72213: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 28023 1726853608.72245: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff8457a9310> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 28023 1726853608.72279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 28023 1726853608.72305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 28023 1726853608.72336: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.72346: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8457da9c0> <<< 28023 1726853608.72380: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457da750> <<< 28023 1726853608.72411: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457da060> <<< 28023 1726853608.72433: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 28023 1726853608.72477: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457da7b0> <<< 28023 1726853608.72497: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ce29c0> import 
'atexit' # <<< 28023 1726853608.72525: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8457db740> <<< 28023 1726853608.72547: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8457db980> <<< 28023 1726853608.72566: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 28023 1726853608.72631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 28023 1726853608.72685: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457dbec0> <<< 28023 1726853608.72715: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 28023 1726853608.72733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 28023 1726853608.72776: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845129cd0> <<< 28023 1726853608.72807: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.72810: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' 
import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff84512b8f0> <<< 28023 1726853608.72844: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 28023 1726853608.72858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 28023 1726853608.72893: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84512c230> <<< 28023 1726853608.72897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 28023 1726853608.72923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 28023 1726853608.72961: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84512d3d0> <<< 28023 1726853608.72976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 28023 1726853608.73016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 28023 1726853608.73019: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 28023 1726853608.73045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 28023 1726853608.73078: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84512fe00> <<< 28023 1726853608.73117: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff84590fdd0> <<< 28023 1726853608.73146: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84512e0c0> <<< 28023 1726853608.73164: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 28023 1726853608.73277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 28023 1726853608.73291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 28023 1726853608.73430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845137e30> import '_tokenize' # <<< 28023 1726853608.73481: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845136900> <<< 28023 1726853608.73502: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845136660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 28023 1726853608.73727: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff845136bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84512e540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff84517bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84517c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 28023 1726853608.73793: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.73815: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff84517dc40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84517da00> <<< 28023 1726853608.73818: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 28023 1726853608.73820: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 28023 1726853608.73875: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.73879: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8451801d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84517e300> <<< 28023 1726853608.74040: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 28023 1726853608.74044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845183980> <<< 28023 1726853608.74145: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845180380> <<< 28023 1726853608.74206: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.74237: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845184740> <<< 28023 1726853608.74289: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845184aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845184b30> <<< 28023 1726853608.74348: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84517c2c0> <<< 28023 1726853608.74351: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 28023 1726853608.74391: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.74466: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8450103e0> <<< 28023 1726853608.74573: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.74611: stdout chunk (state=3): >>># extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845011250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845186b70> <<< 28023 1726853608.74690: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845187ef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845186780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 28023 1726853608.74779: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.74860: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.74938: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 28023 1726853608.74951: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.75052: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.75168: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.75735: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.76299: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 28023 1726853608.76304: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 28023 1726853608.76349: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 28023 1726853608.76353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853608.76405: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8450155b0> <<< 28023 1726853608.76502: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 28023 1726853608.76516: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450169c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450113a0> <<< 28023 1726853608.76583: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 28023 1726853608.76606: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.76628: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 28023 1726853608.76644: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.76783: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.76951: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 28023 1726853608.76955: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845016b10> <<< 28023 
1726853608.76977: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.77489: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.77883: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.77953: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.78050: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 28023 1726853608.78053: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.78092: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.78116: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 28023 1726853608.78194: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.78293: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 28023 1726853608.78298: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.78320: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 28023 1726853608.78360: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.78402: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 28023 1726853608.78639: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.79103: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 28023 1726853608.79144: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450176e0> <<< 28023 1726853608.79159: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.79274: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 
1726853608.79386: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 28023 1726853608.79419: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.79478: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.79531: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 28023 1726853608.79552: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.79657: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.79738: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.79839: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 28023 1726853608.79896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853608.80009: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845021f40> <<< 28023 1726853608.80088: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84501fd70> <<< 28023 1726853608.80102: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 28023 1726853608.80108: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.80202: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 28023 1726853608.80345: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.80403: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 28023 1726853608.80415: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853608.80421: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 28023 1726853608.80476: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 28023 1726853608.80578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 28023 1726853608.80614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 28023 1726853608.80700: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84510a810> <<< 28023 1726853608.80795: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457fa4e0> <<< 28023 1726853608.80872: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845021d90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845016f90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 28023 1726853608.80919: stdout chunk (state=3): >>># zipimport: 
zlib available # zipimport: zlib available <<< 28023 1726853608.80949: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 28023 1726853608.80954: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 28023 1726853608.81044: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 28023 1726853608.81063: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 28023 1726853608.81163: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.81168: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.81250: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.81274: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.81296: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.81386: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.81409: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.81462: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.81521: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 28023 1726853608.81696: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.81784: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.81826: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 28023 1726853608.82268: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.82589: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.82598: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450b5fa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 28023 1726853608.82602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 28023 1726853608.82604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 28023 1726853608.82641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844c97ef0> <<< 28023 1726853608.82666: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.82684: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844c9c290> <<< 28023 1726853608.82735: stdout chunk 
(state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84509eab0> <<< 28023 1726853608.82812: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450b6ab0> <<< 28023 1726853608.82815: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450b4620> <<< 28023 1726853608.82833: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450b42c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 28023 1726853608.82899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 28023 1726853608.82937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 28023 1726853608.83029: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 28023 1726853608.83034: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844c9f1d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844c9eab0> <<< 28023 1726853608.83037: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844c9ec60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844c9dee0> <<< 28023 1726853608.83068: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 28023 1726853608.83309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844c9f260> <<< 28023 1726853608.83313: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 28023 1726853608.83365: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844d05d30> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844c9fd10> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450b42f0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 28023 1726853608.83368: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 28023 1726853608.83387: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.83439: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 28023 1726853608.83513: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 28023 1726853608.83563: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.83635: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 28023 1726853608.83674: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 28023 1726853608.83678: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.83734: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 28023 1726853608.83760: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.83796: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.83837: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 28023 1726853608.84065: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.84297: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 28023 1726853608.85066: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.85776: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 28023 1726853608.85800: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.85886: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.85972: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.86018: stdout chunk (state=3): >>># zipimport: zlib available <<< 
28023 1726853608.86090: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 28023 1726853608.86107: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.86146: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.86203: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 28023 1726853608.86305: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.86382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 28023 1726853608.86416: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.86454: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.86502: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 28023 1726853608.86689: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 28023 1726853608.86729: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.86867: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 28023 1726853608.86884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 28023 1726853608.86909: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844d078c0> <<< 28023 1726853608.86949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 28023 1726853608.86992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 28023 1726853608.87189: stdout chunk (state=3): >>>import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff844d067b0> <<< 28023 1726853608.87217: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 28023 1726853608.87318: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.87431: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 28023 1726853608.87435: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.87569: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.87702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 28023 1726853608.87893: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.87914: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 28023 1726853608.87947: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.88001: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.88082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 28023 1726853608.88159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 28023 1726853608.88502: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844d39e80> <<< 28023 1726853608.88670: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844d074d0> <<< 28023 1726853608.88675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 28023 1726853608.88696: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 28023 1726853608.88776: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.88863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 28023 1726853608.88882: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.89004: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.89136: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.89314: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.89549: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 28023 1726853608.89552: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 28023 1726853608.89597: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.89649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 28023 1726853608.89723: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.89792: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 28023 1726853608.89797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 28023 1726853608.89825: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853608.89853: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844d51970> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844d515b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available 
<<< 28023 1726853608.89901: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 28023 1726853608.89905: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.89944: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.89990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 28023 1726853608.90003: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.90409: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 28023 1726853608.90473: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.90575: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.90615: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.90665: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 28023 1726853608.90668: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 28023 1726853608.90674: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.90730: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.90909: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.91137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.91333: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.91897: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.92613: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 
'ansible.module_utils.facts.hardware.hurd' # <<< 28023 1726853608.92626: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.92772: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.92927: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 28023 1726853608.93000: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.93090: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.93233: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 28023 1726853608.93255: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.93488: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.93799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 28023 1726853608.93844: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.93889: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 28023 1726853608.94054: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.94205: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.94534: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.94812: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.94918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 28023 1726853608.94935: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 
1726853608.95379: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.95597: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 28023 1726853608.95785: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.96092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.96155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 28023 1726853608.96193: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.96243: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 28023 1726853608.96269: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.96285: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.96413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 28023 1726853608.96459: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.96610: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 28023 1726853608.96613: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.96615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 28023 1726853608.96786: stdout chunk (state=3): >>># zipimport: zlib available 
# zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.96838: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.96859: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.96928: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.97015: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 28023 1726853608.97019: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 28023 1726853608.97165: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.97174: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 28023 1726853608.97177: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.97386: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.97600: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853608.97652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 28023 1726853608.97676: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.97761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 28023 1726853608.97816: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.97926: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 28023 1726853608.98000: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.98101: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.ansible_collector' # <<< 28023 1726853608.98104: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 28023 1726853608.98185: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853608.98899: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 28023 1726853608.99094: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844ae61e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844ae46e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844adfef0> <<< 28023 1726853609.28064: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 28023 1726853609.28088: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 28023 1726853609.28104: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844b2d0a0> <<< 28023 1726853609.28128: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/synchronize.py <<< 28023 1726853609.28140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 28023 1726853609.28434: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844b2de50> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844b74440> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844b76150> <<< 28023 1726853609.28689: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 28023 1726853609.49444: stdout chunk (state=3): >>> <<< 28023 1726853609.49597: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, 
"module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": 
"off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.458984375, "5m": 0.47607421875, "15m": 0.28466796875}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2983, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 548, "free": 2983}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 753, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261802692608, "block_size": 4096, "block_total": 65519099, "block_available": 63916673, "block_used": 1602426, "inode_total": 131070960, 
"inode_available": 131029145, "inode_used": 41815, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "33", "second": "29", "epoch": "1726853609", "epoch_int": "1726853609", "date": "2024-09-20", "time": "13:33:29", "iso8601_micro": "2024-09-20T17:33:29.490922Z", "iso8601": "2024-09-20T17:33:29Z", "iso8601_basic": "20240920T133329490922", "iso8601_basic_short": "20240920T133329", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28023 1726853609.50537: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # 
cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc 
# cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanu<<< 28023 1726853609.50582: stdout chunk (state=3): >>>p[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # 
cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file 
# cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # 
cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] 
removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2<<< 28023 1726853609.50616: stdout chunk (state=3): >>>] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 28023 1726853609.50940: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 28023 1726853609.50976: stdout chunk (state=3): 
>>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 28023 1726853609.51011: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 28023 1726853609.51038: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 28023 1726853609.51097: stdout chunk (state=3): >>># destroy ntpath # destroy importlib <<< 28023 1726853609.51127: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 28023 1726853609.51151: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select <<< 28023 1726853609.51176: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess <<< 28023 1726853609.51222: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux <<< 28023 1726853609.51248: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 28023 1726853609.51317: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 28023 1726853609.51344: stdout chunk (state=3): >>># destroy _pickle # destroy queue <<< 28023 1726853609.51372: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 28023 1726853609.51405: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime 
<<< 28023 1726853609.51423: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 28023 1726853609.51470: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux <<< 28023 1726853609.51484: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy json <<< 28023 1726853609.51504: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 28023 1726853609.51546: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 28023 1726853609.51553: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 28023 1726853609.51625: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 28023 1726853609.51657: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize <<< 28023 1726853609.51661: stdout chunk (state=3): >>># cleanup[3] wiping platform <<< 28023 1726853609.51677: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # 
cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 28023 1726853609.51701: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 28023 1726853609.51729: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 28023 1726853609.51787: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 28023 1726853609.51790: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 28023 1726853609.51860: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime <<< 28023 1726853609.52112: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 28023 1726853609.52173: stdout chunk (state=3): >>># destroy _collections <<< 28023 1726853609.52177: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 28023 1726853609.52205: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 28023 1726853609.52233: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 28023 1726853609.52256: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 28023 1726853609.52319: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 28023 1726853609.52335: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 28023 1726853609.52456: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna <<< 28023 1726853609.52459: stdout chunk (state=3): >>># destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 28023 1726853609.52518: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 28023 1726853609.52561: stdout chunk (state=3): >>># destroy _operator # destroy _sre 
# destroy _string # destroy re <<< 28023 1726853609.52565: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 28023 1726853609.52642: stdout chunk (state=3): >>># clear sys.audit hooks <<< 28023 1726853609.53217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853609.53221: stdout chunk (state=3): >>><<< 28023 1726853609.53223: stderr chunk (state=3): >>><<< 28023 1726853609.53429: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ce04d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845cafb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ce2a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # 
import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845a91130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845a91fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845acfdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845acffe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b07800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff845b07e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ae7aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ae51c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845accf80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b276e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b26300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ae6060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845acee70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5c7a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845acc200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845b5cc50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5cb00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845b5cef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845acad20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5d5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5d280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5e4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b74680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845b75d30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b76bd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845b77230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b76120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845b77cb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b773e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5e450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845867b90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845890650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8458903b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845890680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845890fb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845891910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845890860> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845865d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845892cc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8458917f0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845b5eba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff8458bf020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8458e3410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8459401a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845942900> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8459402c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84590d1c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff8457492e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8458e2210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845893bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff8458e2570> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_b7u5d_20/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457ab020> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845789f10> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457890a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457a9310> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8457da9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457da750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457da060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457da7b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845ce29c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8457db740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8457db980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457dbec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845129cd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff84512b8f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84512c230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84512d3d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff84512fe00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff84590fdd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84512e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845137e30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845136900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845136660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845136bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84512e540> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff84517bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84517c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff84517dc40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84517da00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8451801d0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff84517e300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845183980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845180380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845184740> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845184aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845184b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84517c2c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8450103e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845011250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845186b70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845187ef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845186780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff8450155b0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450169c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450113a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845016b10> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450176e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff845021f40> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84501fd70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84510a810> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8457fa4e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845021d90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff845016f90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450b5fa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844c97ef0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844c9c290> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff84509eab0> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff8450b6ab0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450b4620> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450b42c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844c9f1d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844c9eab0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844c9ec60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844c9dee0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844c9f260> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844d05d30> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844c9fd10> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff8450b42f0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844d078c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844d067b0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844d39e80> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844d074d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844d51970> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844d515b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff844ae61e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844ae46e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844adfef0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844b2d0a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844b2de50> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844b74440> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff844b76150> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": 
"host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.458984375, "5m": 0.47607421875, "15m": 0.28466796875}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2983, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 548, "free": 2983}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 753, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261802692608, "block_size": 4096, "block_total": 65519099, "block_available": 63916673, "block_used": 1602426, "inode_total": 131070960, "inode_available": 131029145, "inode_used": 41815, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "33", "second": "29", "epoch": "1726853609", "epoch_int": "1726853609", "date": "2024-09-20", "time": "13:33:29", "iso8601_micro": "2024-09-20T17:33:29.490922Z", "iso8601": "2024-09-20T17:33:29Z", "iso8601_basic": "20240920T133329490922", "iso8601_basic_short": "20240920T133329", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] 
removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] 
removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize 
# cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing 
multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing 
ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] 
removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy 
ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy 
ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy 
pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # 
cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy 
ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy 
ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy 
importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # 
cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping 
_frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
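[Editor's note: the interpreter-discovery warning above can be avoided by pinning the interpreter explicitly, as the linked documentation describes. A minimal inventory sketch, assuming a YAML inventory like the /tmp/network-iHm/inventory.yml loaded earlier in this run (the host entry and path shown are illustrative, not taken from this log):]

```yaml
# Hypothetical inventory fragment: setting ansible_python_interpreter
# per host disables interpreter discovery and the associated warning.
all:
  hosts:
    managed_node3:
      ansible_python_interpreter: /usr/bin/python3.12
```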
28023 1726853609.55795: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853609.55803: _low_level_execute_command(): starting 28023 1726853609.55806: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853607.9809022-28042-235173589384772/ > /dev/null 2>&1 && sleep 0' 28023 1726853609.56988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 
1726853609.57116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853609.57140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853609.57161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853609.57223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853609.59804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853609.59808: stdout chunk (state=3): >>><<< 28023 1726853609.59814: stderr chunk (state=3): >>><<< 28023 1726853609.59831: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853609.59840: handler run complete 28023 1726853609.60026: variable 'ansible_facts' from source: 
unknown 28023 1726853609.60128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853609.60667: variable 'ansible_facts' from source: unknown 28023 1726853609.60862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853609.61296: attempt loop complete, returning result 28023 1726853609.61299: _execute() done 28023 1726853609.61302: dumping result to json 28023 1726853609.61337: done dumping result, returning 28023 1726853609.61344: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-fdb6-dad7-0000000000bf] 28023 1726853609.61347: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000bf ok: [managed_node3] 28023 1726853609.62547: no more pending results, returning what we have 28023 1726853609.62550: results queue empty 28023 1726853609.62551: checking for any_errors_fatal 28023 1726853609.62552: done checking for any_errors_fatal 28023 1726853609.62553: checking for max_fail_percentage 28023 1726853609.62556: done checking for max_fail_percentage 28023 1726853609.62557: checking to see if all hosts have failed and the running result is not ok 28023 1726853609.62558: done checking to see if all hosts have failed 28023 1726853609.62559: getting the remaining hosts for this loop 28023 1726853609.62561: done getting the remaining hosts for this loop 28023 1726853609.62564: getting the next task for host managed_node3 28023 1726853609.62570: done getting next task for host managed_node3 28023 1726853609.62574: ^ task is: TASK: meta (flush_handlers) 28023 1726853609.62576: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853609.62580: getting variables 28023 1726853609.62581: in VariableManager get_vars() 28023 1726853609.62604: Calling all_inventory to load vars for managed_node3 28023 1726853609.62607: Calling groups_inventory to load vars for managed_node3 28023 1726853609.62610: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853609.62616: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000bf 28023 1726853609.62622: WORKER PROCESS EXITING 28023 1726853609.62634: Calling all_plugins_play to load vars for managed_node3 28023 1726853609.62637: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853609.62641: Calling groups_plugins_play to load vars for managed_node3 28023 1726853609.62845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853609.63096: done with get_vars() 28023 1726853609.63106: done getting variables 28023 1726853609.63180: in VariableManager get_vars() 28023 1726853609.63189: Calling all_inventory to load vars for managed_node3 28023 1726853609.63191: Calling groups_inventory to load vars for managed_node3 28023 1726853609.63193: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853609.63197: Calling all_plugins_play to load vars for managed_node3 28023 1726853609.63199: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853609.63202: Calling groups_plugins_play to load vars for managed_node3 28023 1726853609.63336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853609.63518: done with get_vars() 28023 1726853609.63530: done queuing things up, now waiting for results queue to drain 28023 1726853609.63535: results queue empty 28023 1726853609.63536: checking for any_errors_fatal 28023 1726853609.63539: done checking for any_errors_fatal 28023 1726853609.63539: checking for max_fail_percentage 28023 
1726853609.63540: done checking for max_fail_percentage 28023 1726853609.63541: checking to see if all hosts have failed and the running result is not ok 28023 1726853609.63552: done checking to see if all hosts have failed 28023 1726853609.63557: getting the remaining hosts for this loop 28023 1726853609.63559: done getting the remaining hosts for this loop 28023 1726853609.63569: getting the next task for host managed_node3 28023 1726853609.63577: done getting next task for host managed_node3 28023 1726853609.63580: ^ task is: TASK: Include the task 'el_repo_setup.yml' 28023 1726853609.63581: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853609.63583: getting variables 28023 1726853609.63584: in VariableManager get_vars() 28023 1726853609.63591: Calling all_inventory to load vars for managed_node3 28023 1726853609.63593: Calling groups_inventory to load vars for managed_node3 28023 1726853609.63595: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853609.63598: Calling all_plugins_play to load vars for managed_node3 28023 1726853609.63600: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853609.63602: Calling groups_plugins_play to load vars for managed_node3 28023 1726853609.63851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853609.64293: done with get_vars() 28023 1726853609.64300: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:11 Friday 20 September 2024 13:33:29 -0400 (0:00:01.718) 0:00:01.728 ****** 28023 
1726853609.64499: entering _queue_task() for managed_node3/include_tasks 28023 1726853609.64501: Creating lock for include_tasks 28023 1726853609.65152: worker is 1 (out of 1 available) 28023 1726853609.65169: exiting _queue_task() for managed_node3/include_tasks 28023 1726853609.65183: done queuing things up, now waiting for results queue to drain 28023 1726853609.65185: waiting for pending results... 28023 1726853609.65438: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 28023 1726853609.65543: in run() - task 02083763-bbaf-fdb6-dad7-000000000006 28023 1726853609.65564: variable 'ansible_search_path' from source: unknown 28023 1726853609.65607: calling self._execute() 28023 1726853609.65684: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853609.65695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853609.65709: variable 'omit' from source: magic vars 28023 1726853609.65846: _execute() done 28023 1726853609.65850: dumping result to json 28023 1726853609.65856: done dumping result, returning 28023 1726853609.65859: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-fdb6-dad7-000000000006] 28023 1726853609.65936: sending task result for task 02083763-bbaf-fdb6-dad7-000000000006 28023 1726853609.66208: no more pending results, returning what we have 28023 1726853609.66214: in VariableManager get_vars() 28023 1726853609.66247: Calling all_inventory to load vars for managed_node3 28023 1726853609.66250: Calling groups_inventory to load vars for managed_node3 28023 1726853609.66253: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853609.66269: Calling all_plugins_play to load vars for managed_node3 28023 1726853609.66274: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853609.66278: Calling groups_plugins_play to load vars for managed_node3 28023 1726853609.66569: 
done sending task result for task 02083763-bbaf-fdb6-dad7-000000000006 28023 1726853609.66575: WORKER PROCESS EXITING 28023 1726853609.66602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853609.66831: done with get_vars() 28023 1726853609.66838: variable 'ansible_search_path' from source: unknown 28023 1726853609.66851: we have included files to process 28023 1726853609.66852: generating all_blocks data 28023 1726853609.66853: done generating all_blocks data 28023 1726853609.66853: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28023 1726853609.66854: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28023 1726853609.66857: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28023 1726853609.67514: in VariableManager get_vars() 28023 1726853609.67530: done with get_vars() 28023 1726853609.67541: done processing included file 28023 1726853609.67543: iterating over new_blocks loaded from include file 28023 1726853609.67544: in VariableManager get_vars() 28023 1726853609.67553: done with get_vars() 28023 1726853609.67554: filtering new block on tags 28023 1726853609.67574: done filtering new block on tags 28023 1726853609.67577: in VariableManager get_vars() 28023 1726853609.67644: done with get_vars() 28023 1726853609.67646: filtering new block on tags 28023 1726853609.67660: done filtering new block on tags 28023 1726853609.67663: in VariableManager get_vars() 28023 1726853609.67678: done with get_vars() 28023 1726853609.67680: filtering new block on tags 28023 1726853609.67692: done filtering new block on tags 28023 1726853609.67694: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 28023 1726853609.67700: extending task lists for all hosts with included blocks 28023 1726853609.67745: done extending task lists 28023 1726853609.67746: done processing included files 28023 1726853609.67747: results queue empty 28023 1726853609.67747: checking for any_errors_fatal 28023 1726853609.67749: done checking for any_errors_fatal 28023 1726853609.67750: checking for max_fail_percentage 28023 1726853609.67751: done checking for max_fail_percentage 28023 1726853609.67751: checking to see if all hosts have failed and the running result is not ok 28023 1726853609.67752: done checking to see if all hosts have failed 28023 1726853609.67753: getting the remaining hosts for this loop 28023 1726853609.67754: done getting the remaining hosts for this loop 28023 1726853609.67756: getting the next task for host managed_node3 28023 1726853609.67760: done getting next task for host managed_node3 28023 1726853609.67762: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 28023 1726853609.67764: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853609.67766: getting variables 28023 1726853609.67767: in VariableManager get_vars() 28023 1726853609.67777: Calling all_inventory to load vars for managed_node3 28023 1726853609.67914: Calling groups_inventory to load vars for managed_node3 28023 1726853609.67917: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853609.67923: Calling all_plugins_play to load vars for managed_node3 28023 1726853609.67926: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853609.67929: Calling groups_plugins_play to load vars for managed_node3 28023 1726853609.68343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853609.68777: done with get_vars() 28023 1726853609.68787: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:33:29 -0400 (0:00:00.043) 0:00:01.772 ****** 28023 1726853609.68851: entering _queue_task() for managed_node3/setup 28023 1726853609.69639: worker is 1 (out of 1 available) 28023 1726853609.69656: exiting _queue_task() for managed_node3/setup 28023 1726853609.69666: done queuing things up, now waiting for results queue to drain 28023 1726853609.69668: waiting for pending results... 
28023 1726853609.69824: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 28023 1726853609.69931: in run() - task 02083763-bbaf-fdb6-dad7-0000000000d0 28023 1726853609.69948: variable 'ansible_search_path' from source: unknown 28023 1726853609.69959: variable 'ansible_search_path' from source: unknown 28023 1726853609.70001: calling self._execute() 28023 1726853609.70081: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853609.70093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853609.70107: variable 'omit' from source: magic vars 28023 1726853609.70625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853609.72628: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853609.72783: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853609.72790: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853609.72796: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853609.72826: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853609.72913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853609.72942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853609.72972: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853609.73017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853609.73036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853609.73213: variable 'ansible_facts' from source: unknown 28023 1726853609.73294: variable 'network_test_required_facts' from source: task vars 28023 1726853609.73337: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 28023 1726853609.73346: variable 'omit' from source: magic vars 28023 1726853609.73388: variable 'omit' from source: magic vars 28023 1726853609.73421: variable 'omit' from source: magic vars 28023 1726853609.73458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853609.73549: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853609.73552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853609.73556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853609.73559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853609.73579: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853609.73587: variable 'ansible_host' from source: host vars for 
'managed_node3' 28023 1726853609.73594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853609.73692: Set connection var ansible_shell_type to sh 28023 1726853609.73716: Set connection var ansible_shell_executable to /bin/sh 28023 1726853609.73726: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853609.73735: Set connection var ansible_connection to ssh 28023 1726853609.73744: Set connection var ansible_pipelining to False 28023 1726853609.73752: Set connection var ansible_timeout to 10 28023 1726853609.73874: variable 'ansible_shell_executable' from source: unknown 28023 1726853609.73878: variable 'ansible_connection' from source: unknown 28023 1726853609.73880: variable 'ansible_module_compression' from source: unknown 28023 1726853609.73882: variable 'ansible_shell_type' from source: unknown 28023 1726853609.73884: variable 'ansible_shell_executable' from source: unknown 28023 1726853609.73886: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853609.73888: variable 'ansible_pipelining' from source: unknown 28023 1726853609.73890: variable 'ansible_timeout' from source: unknown 28023 1726853609.73892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853609.73994: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853609.74008: variable 'omit' from source: magic vars 28023 1726853609.74018: starting attempt loop 28023 1726853609.74024: running the handler 28023 1726853609.74041: _low_level_execute_command(): starting 28023 1726853609.74052: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853609.74790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853609.74863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853609.74894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853609.74920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853609.74937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853609.75386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853609.77391: stdout chunk (state=3): >>>/root <<< 28023 1726853609.77589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853609.77627: stderr chunk (state=3): >>><<< 28023 1726853609.77636: stdout chunk (state=3): >>><<< 28023 1726853609.77673: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853609.77781: _low_level_execute_command(): starting 28023 1726853609.77792: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407 `" && echo ansible-tmp-1726853609.7776623-28124-179999548186407="` echo /root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407 `" ) && sleep 0' 28023 1726853609.78946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853609.78993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 
1726853609.79004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853609.79272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853609.79312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853609.81678: stdout chunk (state=3): >>>ansible-tmp-1726853609.7776623-28124-179999548186407=/root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407 <<< 28023 1726853609.81776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853609.81808: stderr chunk (state=3): >>><<< 28023 1726853609.81816: stdout chunk (state=3): >>><<< 28023 1726853609.81841: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853609.7776623-28124-179999548186407=/root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853609.82277: variable 'ansible_module_compression' from source: unknown 28023 1726853609.82280: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28023 1726853609.82282: variable 'ansible_facts' from source: unknown 28023 1726853609.82647: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407/AnsiballZ_setup.py 28023 1726853609.82960: Sending initial data 28023 1726853609.82969: Sent initial data (154 bytes) 28023 1726853609.84262: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853609.84394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853609.84412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853609.84437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853609.84700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853609.86555: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853609.86609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853609.86672: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp8gcalgor /root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407/AnsiballZ_setup.py <<< 28023 1726853609.86685: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407/AnsiballZ_setup.py" <<< 28023 1726853609.86889: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp8gcalgor" to remote "/root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407/AnsiballZ_setup.py" <<< 28023 1726853609.86986: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407/AnsiballZ_setup.py" <<< 28023 1726853609.90181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853609.90194: stdout chunk (state=3): >>><<< 28023 1726853609.90388: stderr chunk (state=3): >>><<< 28023 1726853609.90392: done transferring module to remote 28023 1726853609.90394: _low_level_execute_command(): starting 28023 1726853609.90397: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407/ /root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407/AnsiballZ_setup.py && sleep 0' 28023 1726853609.91892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853609.91959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853609.92058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853609.92119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853609.94276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853609.94308: stderr chunk (state=3): >>><<< 28023 1726853609.94324: stdout chunk (state=3): >>><<< 28023 1726853609.94367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853609.94452: _low_level_execute_command(): starting 28023 1726853609.94463: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407/AnsiballZ_setup.py && sleep 0' 28023 1726853609.95137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853609.95183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853609.95205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853609.95221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853609.95463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 
1726853609.97938: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 28023 1726853609.97992: stdout chunk (state=3): >>>import _imp # builtin <<< 28023 1726853609.98013: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 28023 1726853609.98087: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 28023 1726853609.98108: stdout chunk (state=3): >>>import 'posix' # <<< 28023 1726853609.98168: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 28023 1726853609.98194: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 28023 1726853609.98244: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 28023 1726853609.98389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 28023 1726853609.98444: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1f684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1f37b30> <<< 28023 1726853609.98877: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1f6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 28023 1726853609.98905: stdout chunk (state=3): 
>>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d5d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d5dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28023 1726853609.99242: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 28023 1726853609.99267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853609.99295: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 28023 1726853609.99342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 28023 1726853609.99365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 28023 1726853609.99394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d9bec0> <<< 28023 1726853609.99413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 28023 1726853609.99462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d9bf80> <<< 28023 1726853609.99480: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 28023 1726853609.99585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 28023 1726853609.99625: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1dd3830> <<< 28023 1726853609.99676: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1dd3ec0> import '_collections' # <<< 28023 1726853609.99764: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1db3b60> import '_functools' # <<< 28023 1726853609.99767: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1db12b0> <<< 28023 1726853609.99968: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d99070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 28023 1726853609.99976: stdout chunk (state=3): >>>import '_sre' # <<< 28023 1726853609.99979: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 28023 1726853610.00046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 28023 1726853610.00050: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 28023 1726853610.00085: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1df37d0> <<< 28023 1726853610.00174: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1df23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1db2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1df0bc0> <<< 28023 1726853610.00179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e28890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d982f0> <<< 28023 1726853610.00181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 28023 1726853610.00324: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1e28d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e28bf0> <<< 28023 1726853610.00327: stdout chunk (state=3): >>># extension module 'binascii' loaded from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1e28fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d96e10> <<< 28023 1726853610.00330: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 28023 1726853610.00457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e29670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e29370> import 'importlib.machinery' # <<< 28023 1726853610.00520: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e2a540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 28023 1726853610.00533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 28023 1726853610.00536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 28023 1726853610.00658: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e40740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1e41e20> <<< 28023 1726853610.00686: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e42cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1e432f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e42210> <<< 28023 1726853610.00703: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 28023 1726853610.00746: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.00759: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1e43d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e434a0> <<< 28023 1726853610.00811: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e2a4b0> <<< 28023 1726853610.00936: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 28023 1726853610.00940: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 28023 1726853610.00988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b37c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b607a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b60500> <<< 28023 1726853610.01000: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b607d0> <<< 28023 1726853610.01090: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 28023 1726853610.01094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 28023 1726853610.01110: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.01288: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b61100> <<< 28023 1726853610.01405: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b61af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b609b0> <<< 28023 1726853610.01442: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b35df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 28023 1726853610.01519: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b62f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b61c40> <<< 28023 1726853610.01549: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e2ac60> <<< 28023 1726853610.01553: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 28023 1726853610.01704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.01707: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 28023 1726853610.01736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b8b230> <<< 28023 1726853610.01756: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.01780: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 28023 1726853610.01830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 28023 1726853610.01979: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1baf5f0> <<< 28023 1726853610.01988: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 28023 1726853610.01991: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 28023 1726853610.01993: stdout chunk (state=3): >>>import 'ntpath' # <<< 28023 1726853610.02065: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1c10380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 28023 1726853610.02074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 28023 1726853610.02280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 28023 1726853610.02354: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1c12ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1c104a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1bd1370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1529430> <<< 28023 1726853610.02368: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1bae3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b63e00> <<< 28023 1726853610.02533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 28023 
1726853610.02558: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa9d1bae750> <<< 28023 1726853610.02782: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_2rmix2jf/ansible_setup_payload.zip' # zipimport: zlib available <<< 28023 1726853610.02911: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.02940: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 28023 1726853610.02965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 28023 1726853610.02995: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 28023 1726853610.03081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 28023 1726853610.03113: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1593170> import '_typing' # <<< 28023 1726853610.03312: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1572060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15711c0> # zipimport: zlib available <<< 28023 1726853610.03339: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 28023 1726853610.03405: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.03409: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 28023 1726853610.05005: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 28023 1726853610.05982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1591040> <<< 28023 1726853610.06013: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.06092: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 28023 1726853610.06097: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d15c2b10> <<< 28023 1726853610.06151: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15c28a0> <<< 28023 1726853610.06182: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15c21b0> <<< 28023 1726853610.06317: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 28023 1726853610.06320: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15c2ba0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1593e00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d15c3860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d15c39e0> <<< 28023 1726853610.06322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 28023 1726853610.06374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 28023 1726853610.06388: stdout chunk (state=3): >>>import '_locale' # <<< 28023 1726853610.06433: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15c3ef0> <<< 28023 1726853610.06459: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 28023 1726853610.06485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 28023 1726853610.06547: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d142dc70> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.06567: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d142f890> <<< 28023 1726853610.06694: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1430290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1431430> <<< 28023 1726853610.06712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 28023 1726853610.06750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 28023 1726853610.06781: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 28023 1726853610.06784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 28023 1726853610.06835: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1433f20> <<< 28023 1726853610.06867: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import 
'_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b62e70> <<< 28023 1726853610.06890: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d14321e0> <<< 28023 1726853610.06996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 28023 1726853610.07107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 28023 1726853610.07132: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 28023 1726853610.07151: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d143bdd0> import '_tokenize' # <<< 28023 1726853610.07230: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d143a8a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d143a600> <<< 28023 1726853610.07250: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 28023 1726853610.07334: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d143ab70> <<< 28023 1726853610.07356: stdout chunk (state=3): >>>import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa9d14326f0> <<< 28023 1726853610.07484: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d147ff80> <<< 28023 1726853610.07512: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d147ffe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.07541: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1481b50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1481910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 28023 1726853610.07648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 28023 
1726853610.07651: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.07654: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1483fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d14821b0> <<< 28023 1726853610.07656: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 28023 1726853610.07790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1487770> <<< 28023 1726853610.07974: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1484140> <<< 28023 1726853610.07978: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1488530> <<< 28023 1726853610.07996: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1488770> <<< 28023 1726853610.08044: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.08064: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1488aa0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1480260> <<< 28023 1726853610.08086: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 28023 1726853610.08101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 28023 1726853610.08126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 28023 1726853610.08295: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d13140e0> <<< 28023 1726853610.08324: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.08405: stdout chunk (state=3): >>># extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1315310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d148a870> <<< 28023 1726853610.08412: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d148bbf0> <<< 28023 1726853610.08420: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d148a4e0> # zipimport: zlib available <<< 28023 1726853610.08424: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 28023 1726853610.08433: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.08530: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.08621: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.08639: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 28023 1726853610.08795: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.08914: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.09473: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.10013: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 28023 1726853610.10034: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 
'ansible.module_utils.common.text.converters' # <<< 28023 1726853610.10055: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 28023 1726853610.10065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.10124: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1319430> <<< 28023 1726853610.10208: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 28023 1726853610.10299: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d131a1b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1488110> import 'ansible.module_utils.compat.selinux' # <<< 28023 1726853610.10325: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.10501: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 28023 1726853610.10505: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.10507: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.10705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 28023 1726853610.10708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa9d131a360> <<< 28023 1726853610.10715: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.11138: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.11576: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.11646: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.11722: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 28023 1726853610.11776: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.11780: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.11892: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.11976: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 28023 1726853610.11983: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.12018: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 28023 1726853610.12040: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.12176: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 28023 1726853610.12179: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.12316: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.12545: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 28023 1726853610.12611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 28023 1726853610.12785: stdout chunk (state=3): >>>import '_ast' # <<< 28023 1726853610.12849: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa9d131b440> # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.12853: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 28023 1726853610.12856: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 28023 1726853610.12875: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.12923: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.13091: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.13111: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.13378: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 28023 1726853610.13382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.13384: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1326090> <<< 28023 1726853610.13387: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13216a0> <<< 28023 1726853610.13389: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 28023 1726853610.13403: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 28023 
1726853610.13466: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.13524: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.13548: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.13691: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 28023 1726853610.13716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 28023 1726853610.13744: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 28023 1726853610.13763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 28023 1726853610.13814: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d140e900> <<< 28023 1726853610.13852: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15ee5d0> <<< 28023 1726853610.13938: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1326060> <<< 28023 1726853610.13964: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13183b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 28023 
1726853610.14104: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 28023 1726853610.14121: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14173: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14235: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14263: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14277: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14314: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14353: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14389: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14431: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 28023 1726853610.14476: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14511: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14584: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14801: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14809: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 28023 1726853610.14825: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.14999: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.15039: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.15098: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.15147: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 28023 1726853610.15291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 28023 1726853610.15304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 28023 1726853610.15336: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13b5fd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 28023 1726853610.15348: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0f5bf20> <<< 28023 1726853610.15385: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.15400: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0f702c0> <<< 28023 1726853610.15453: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d139e840> <<< 28023 1726853610.15469: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13b6b40> <<< 28023 1726853610.15499: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13b46b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13b47a0> <<< 28023 1726853610.15731: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 28023 1726853610.15735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0f73290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0f72b40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0f72d20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0f71f70> <<< 28023 1726853610.15738: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 28023 1726853610.15866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 28023 1726853610.15891: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0f73410> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 28023 1726853610.15924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 28023 1726853610.15959: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0fbdf40> <<< 28023 1726853610.15987: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0f73f20> <<< 28023 1726853610.16021: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13b4380> <<< 28023 1726853610.16048: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 28023 1726853610.16190: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 
28023 1726853610.16216: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 28023 1726853610.16275: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.16319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 28023 1726853610.16359: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 28023 1726853610.16373: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.16624: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.16627: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.16630: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 28023 1726853610.16685: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.16742: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.16798: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.16858: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 28023 1726853610.16880: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 28023 1726853610.17361: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.17796: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 28023 1726853610.17852: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.17906: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 
1726853610.17942: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.17988: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 28023 1726853610.18089: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 28023 1726853610.18115: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.18180: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 28023 1726853610.18407: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.18410: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.18491: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 28023 1726853610.18513: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0fbf500> <<< 28023 1726853610.18538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 28023 1726853610.18565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 28023 1726853610.18686: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0fbea50> <<< 28023 1726853610.18706: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 28023 1726853610.18892: stdout chunk (state=3): 
>>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 28023 1726853610.18928: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.19023: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 28023 1726853610.19091: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.19168: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 28023 1726853610.19187: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.19214: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.19267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 28023 1726853610.19314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 28023 1726853610.19390: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.19455: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0ffe120> <<< 28023 1726853610.19645: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0fedf40> <<< 28023 1726853610.19666: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 28023 1726853610.19714: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.19892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.19947: stdout chunk (state=3): >>># zipimport: zlib available <<< 
28023 1726853610.20056: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.20204: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 28023 1726853610.20221: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.20256: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.20297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 28023 1726853610.20309: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.20341: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.20393: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 28023 1726853610.20428: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.20458: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d10119d0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0ffdf10> import 'ansible.module_utils.facts.system.user' # <<< 28023 1726853610.20491: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.20596: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 28023 1726853610.20747: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.20896: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 28023 1726853610.21098: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.21144: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.21191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 28023 1726853610.21214: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 28023 1726853610.21235: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.21384: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.21405: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.21528: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 28023 1726853610.21544: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 28023 1726853610.21658: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.21887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 28023 1726853610.21907: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.22410: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.22919: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 28023 1726853610.22940: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.23040: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.23147: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 28023 1726853610.23392: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 28023 1726853610.23506: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.23661: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 28023 1726853610.23693: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 28023 1726853610.23743: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.23779: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 28023 1726853610.23799: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.23889: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.23989: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.24363: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.24459: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 28023 1726853610.24462: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.24531: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 28023 1726853610.24578: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 28023 1726853610.24636: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.24732: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 28023 1726853610.24779: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.24783: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 28023 1726853610.24785: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.24828: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.24899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 28023 1726853610.24903: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.24961: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.25027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 28023 1726853610.25031: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.25290: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.25557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 28023 1726853610.25559: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.25862: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.25888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 28023 1726853610.25948: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.25993: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 28023 1726853610.25995: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.26112: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.26225: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 28023 1726853610.26250: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.26277: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib 
available <<< 28023 1726853610.26370: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.26389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 28023 1726853610.26414: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.26454: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.26472: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.26603: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853610.26702: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.26799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 28023 1726853610.26820: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 28023 1726853610.26901: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.26968: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 28023 1726853610.27305: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.27598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 28023 1726853610.27659: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.27728: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 28023 1726853610.27857: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # <<< 28023 1726853610.27859: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.27991: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.28112: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 28023 1726853610.28239: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.28335: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 28023 1726853610.28357: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # <<< 28023 1726853610.28379: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # <<< 28023 1726853610.28429: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.29125: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 28023 1726853610.29129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 28023 1726853610.29198: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 28023 1726853610.29223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 28023 1726853610.29236: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0e12120> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0e130b0> <<< 28023 1726853610.29282: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0e0be60> <<< 28023 1726853610.30046: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", 
"ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", 
"ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G<<< 28023 1726853610.30076: stdout chunk (state=3): >>>-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "33", "second": "30", "epoch": "1726853610", "epoch_int": "1726853610", "date": "2024-09-20", "time": "13:33:30", "iso8601_micro": "2024-09-20T17:33:30.298647Z", "iso8601": "2024-09-20T17:33:30Z", "iso8601_basic": "20240920T133330298647", "iso8601_basic_short": "20240920T133330", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28023 1726853610.30903: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 28023 1726853610.30914: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing 
zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os <<< 28023 1726853610.31075: stdout chunk (state=3): >>># cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator <<< 28023 1726853610.31114: stdout chunk (state=3): >>># cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] 
removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing 
traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # dest<<< 28023 1726853610.31164: stdout chunk (state=3): >>>roy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_util<<< 28023 1726853610.31189: stdout chunk (state=3): >>>s.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local 
# destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # 
destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 28023 1726853610.31676: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 <<< 28023 1726853610.31687: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 28023 1726853610.31776: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 28023 1726853610.31792: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 28023 1726853610.31833: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 28023 1726853610.31881: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 28023 1726853610.31911: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 28023 1726853610.31952: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 28023 1726853610.32001: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 28023 1726853610.32022: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 28023 1726853610.32065: stdout chunk (state=3): >>># destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 28023 1726853610.32128: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 28023 1726853610.32132: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 28023 1726853610.32165: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 28023 1726853610.32216: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 28023 1726853610.32258: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 28023 1726853610.32262: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # 
cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 28023 1726853610.32284: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 28023 1726853610.32321: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 28023 1726853610.32400: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] 
wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 28023 1726853610.32569: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 28023 1726853610.32617: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 28023 1726853610.32646: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 28023 1726853610.32705: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 28023 1726853610.32716: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 28023 1726853610.32745: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 28023 1726853610.32863: stdout chunk (state=3): >>># destroy codecs <<< 28023 1726853610.32914: stdout chunk (state=3): >>># destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref<<< 28023 1726853610.32917: stdout chunk (state=3): >>> # destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 28023 1726853610.32948: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time<<< 28023 1726853610.32995: stdout chunk (state=3): >>> # destroy _random 
<<< 28023 1726853610.32998: stdout chunk (state=3): >>># destroy _weakref<<< 28023 1726853610.33041: stdout chunk (state=3): >>> # destroy _hashlib # destroy _operator # destroy _sre<<< 28023 1726853610.33076: stdout chunk (state=3): >>> # destroy _string # destroy re # destroy itertools<<< 28023 1726853610.33098: stdout chunk (state=3): >>> <<< 28023 1726853610.33131: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins<<< 28023 1726853610.33157: stdout chunk (state=3): >>> # destroy _thread # clear sys.audit hooks<<< 28023 1726853610.33285: stdout chunk (state=3): >>> <<< 28023 1726853610.33781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853610.33784: stdout chunk (state=3): >>><<< 28023 1726853610.33786: stderr chunk (state=3): >>><<< 28023 1726853610.34200: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1f684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1f37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1f6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d5d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d5dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d9bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d9bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1dd3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1dd3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1db3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1db12b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d99070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1df37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1df23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1db2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1df0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e28890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d982f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1e28d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e28bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1e28fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1d96e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e29670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e29370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e2a540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e40740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1e41e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e42cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1e432f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e42210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1e43d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e434a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e2a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b37c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b607a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b60500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b607d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b61100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b61af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b609b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b35df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b62f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b61c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1e2ac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b8b230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1baf5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1c10380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1c12ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1c104a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1bd1370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1529430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1bae3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1b63e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa9d1bae750> # zipimport: found 103 names in '/tmp/ansible_setup_payload_2rmix2jf/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1593170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1572060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15711c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1591040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d15c2b10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15c28a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15c21b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15c2ba0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1593e00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d15c3860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d15c39e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15c3ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d142dc70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d142f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1430290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1431430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1433f20> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1b62e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d14321e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d143bdd0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d143a8a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d143a600> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d143ab70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d14326f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d147ff80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d147ffe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1481b50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1481910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1483fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d14821b0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1487770> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1484140> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1488530> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1488770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1488aa0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1480260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d13140e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1315310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d148a870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d148bbf0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d148a4e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1319430> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d131a1b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1488110> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d131a360> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d131b440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d1326090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13216a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d140e900> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d15ee5d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d1326060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13183b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13b5fd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0f5bf20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0f702c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d139e840> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fa9d13b6b40> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13b46b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13b47a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0f73290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0f72b40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0f72d20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0f71f70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0f73410> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0fbdf40> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0f73f20> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d13b4380> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0fbf500> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0fbea50> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0ffe120> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0fedf40> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d10119d0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0ffdf10> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9d0e12120> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0e130b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9d0e0be60> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", 
"ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "33", "second": "30", "epoch": "1726853610", "epoch_int": "1726853610", "date": "2024-09-20", "time": "13:33:30", "iso8601_micro": "2024-09-20T17:33:30.298647Z", "iso8601": "2024-09-20T17:33:30Z", "iso8601_basic": "20240920T133330298647", "iso8601_basic_short": "20240920T133330", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], 
"gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # 
cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian 
# cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # 
cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # 
cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. [WARNING]: Module invocation had junk after the JSON data: 28023 1726853610.35116: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': 
'/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853610.35119: _low_level_execute_command(): starting 28023 1726853610.35122: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853609.7776623-28124-179999548186407/ > /dev/null 2>&1 && sleep 0' 28023 1726853610.35490: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853610.35547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853610.35550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853610.35553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853610.35555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853610.35557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853610.35559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853610.35598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853610.35612: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 28023 1726853610.35617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853610.35711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853610.38474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853610.38479: stdout chunk (state=3): >>><<< 28023 1726853610.38677: stderr chunk (state=3): >>><<< 28023 1726853610.38682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853610.38686: handler run complete 28023 1726853610.38689: variable 'ansible_facts' from source: unknown 28023 1726853610.38692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853610.38750: variable 'ansible_facts' 
from source: unknown 28023 1726853610.38813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853610.38869: attempt loop complete, returning result 28023 1726853610.38874: _execute() done 28023 1726853610.38879: dumping result to json 28023 1726853610.38901: done dumping result, returning 28023 1726853610.38910: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-fdb6-dad7-0000000000d0] 28023 1726853610.38919: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000d0 28023 1726853610.39080: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000d0 28023 1726853610.39083: WORKER PROCESS EXITING ok: [managed_node3] 28023 1726853610.39258: no more pending results, returning what we have 28023 1726853610.39262: results queue empty 28023 1726853610.39262: checking for any_errors_fatal 28023 1726853610.39264: done checking for any_errors_fatal 28023 1726853610.39265: checking for max_fail_percentage 28023 1726853610.39267: done checking for max_fail_percentage 28023 1726853610.39268: checking to see if all hosts have failed and the running result is not ok 28023 1726853610.39269: done checking to see if all hosts have failed 28023 1726853610.39270: getting the remaining hosts for this loop 28023 1726853610.39273: done getting the remaining hosts for this loop 28023 1726853610.39277: getting the next task for host managed_node3 28023 1726853610.39286: done getting next task for host managed_node3 28023 1726853610.39289: ^ task is: TASK: Check if system is ostree 28023 1726853610.39292: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853610.39296: getting variables 28023 1726853610.39297: in VariableManager get_vars() 28023 1726853610.39444: Calling all_inventory to load vars for managed_node3 28023 1726853610.39448: Calling groups_inventory to load vars for managed_node3 28023 1726853610.39451: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853610.39466: Calling all_plugins_play to load vars for managed_node3 28023 1726853610.39469: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853610.39478: Calling groups_plugins_play to load vars for managed_node3 28023 1726853610.39866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853610.40108: done with get_vars() 28023 1726853610.40118: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:33:30 -0400 (0:00:00.713) 0:00:02.486 ****** 28023 1726853610.40217: entering _queue_task() for managed_node3/stat 28023 1726853610.40562: worker is 1 (out of 1 available) 28023 1726853610.40576: exiting _queue_task() for managed_node3/stat 28023 1726853610.40585: done queuing things up, now waiting for results queue to drain 28023 1726853610.40587: waiting for pending results... 
28023 1726853610.40801: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 28023 1726853610.40917: in run() - task 02083763-bbaf-fdb6-dad7-0000000000d2 28023 1726853610.40982: variable 'ansible_search_path' from source: unknown 28023 1726853610.40985: variable 'ansible_search_path' from source: unknown 28023 1726853610.40999: calling self._execute() 28023 1726853610.41089: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853610.41102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853610.41116: variable 'omit' from source: magic vars 28023 1726853610.42131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853610.42477: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853610.42500: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853610.42549: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853610.42613: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853610.42715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853610.42744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853610.42793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853610.42824: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853610.42977: Evaluated conditional (not __network_is_ostree is defined): True 28023 1726853610.42995: variable 'omit' from source: magic vars 28023 1726853610.43084: variable 'omit' from source: magic vars 28023 1726853610.43088: variable 'omit' from source: magic vars 28023 1726853610.43148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853610.43227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853610.43250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853610.43276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853610.43320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853610.43335: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853610.43342: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853610.43348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853610.43540: Set connection var ansible_shell_type to sh 28023 1726853610.43632: Set connection var ansible_shell_executable to /bin/sh 28023 1726853610.43634: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853610.43636: Set connection var ansible_connection to ssh 28023 1726853610.43641: Set connection var ansible_pipelining to False 28023 1726853610.43645: Set connection var ansible_timeout to 10 28023 1726853610.43647: variable 'ansible_shell_executable' from source: unknown 28023 1726853610.43649: variable 'ansible_connection' from 
source: unknown 28023 1726853610.43651: variable 'ansible_module_compression' from source: unknown 28023 1726853610.43652: variable 'ansible_shell_type' from source: unknown 28023 1726853610.43656: variable 'ansible_shell_executable' from source: unknown 28023 1726853610.43658: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853610.43660: variable 'ansible_pipelining' from source: unknown 28023 1726853610.43662: variable 'ansible_timeout' from source: unknown 28023 1726853610.43663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853610.43819: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853610.43844: variable 'omit' from source: magic vars 28023 1726853610.43861: starting attempt loop 28023 1726853610.43873: running the handler 28023 1726853610.43892: _low_level_execute_command(): starting 28023 1726853610.43956: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853610.44738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853610.44803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853610.44847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853610.44962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853610.47520: stdout chunk (state=3): >>>/root <<< 28023 1726853610.47601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853610.47776: stderr chunk (state=3): >>><<< 28023 1726853610.47780: stdout chunk (state=3): >>><<< 28023 1726853610.47783: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853610.47793: _low_level_execute_command(): starting 28023 1726853610.47795: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790 `" && echo ansible-tmp-1726853610.477357-28160-46241368556790="` echo /root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790 `" ) && sleep 0' 28023 1726853610.49104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853610.49111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853610.49123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853610.49138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853610.49152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853610.49186: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853610.49248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853610.49489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 
1726853610.49507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853610.49610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853610.52617: stdout chunk (state=3): >>>ansible-tmp-1726853610.477357-28160-46241368556790=/root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790 <<< 28023 1726853610.52700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853610.52703: stdout chunk (state=3): >>><<< 28023 1726853610.52711: stderr chunk (state=3): >>><<< 28023 1726853610.52732: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853610.477357-28160-46241368556790=/root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853610.52787: 
variable 'ansible_module_compression' from source: unknown 28023 1726853610.52844: ANSIBALLZ: Using lock for stat 28023 1726853610.52847: ANSIBALLZ: Acquiring lock 28023 1726853610.52849: ANSIBALLZ: Lock acquired: 139729396668064 28023 1726853610.52978: ANSIBALLZ: Creating module 28023 1726853610.72881: ANSIBALLZ: Writing module into payload 28023 1726853610.73003: ANSIBALLZ: Writing module 28023 1726853610.73024: ANSIBALLZ: Renaming module 28023 1726853610.73039: ANSIBALLZ: Done creating module 28023 1726853610.73070: variable 'ansible_facts' from source: unknown 28023 1726853610.73216: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790/AnsiballZ_stat.py 28023 1726853610.73393: Sending initial data 28023 1726853610.73397: Sent initial data (151 bytes) 28023 1726853610.74048: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853610.74129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853610.74168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853610.74194: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853610.74208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853610.74328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853610.76681: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28023 1726853610.76907: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853610.76911: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853610.76982: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpdmts_mth /root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790/AnsiballZ_stat.py <<< 28023 1726853610.76985: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790/AnsiballZ_stat.py" <<< 28023 1726853610.77041: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpdmts_mth" to remote "/root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790/AnsiballZ_stat.py" <<< 28023 1726853610.78323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853610.78405: stderr chunk (state=3): >>><<< 28023 1726853610.78425: stdout chunk (state=3): >>><<< 28023 1726853610.78497: done transferring module to remote 28023 1726853610.78515: _low_level_execute_command(): starting 28023 1726853610.78533: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790/ /root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790/AnsiballZ_stat.py && sleep 0' 28023 1726853610.79374: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853610.79386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853610.79407: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 28023 1726853610.79489: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853610.79520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853610.79539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853610.79564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853610.79698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853610.82357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853610.82411: stderr chunk (state=3): >>><<< 28023 1726853610.82493: stdout chunk (state=3): >>><<< 28023 1726853610.82510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853610.82519: _low_level_execute_command(): starting 28023 1726853610.82529: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790/AnsiballZ_stat.py && sleep 0' 28023 1726853610.83383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853610.83397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853610.83409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853610.83425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853610.83484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853610.83531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' <<< 28023 1726853610.83553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853610.83581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853610.83681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853610.86890: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 28023 1726853610.86956: stdout chunk (state=3): >>>import _imp # builtin <<< 28023 1726853610.87239: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 28023 1726853610.87290: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.87331: stdout chunk (state=3): >>>import '_codecs' # <<< 28023 1726853610.87452: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 28023 1726853610.87523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88900184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888ffe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 28023 1726853610.87552: stdout chunk (state=3): >>>import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f889001aa50> import '_signal' # <<< 28023 1726853610.87609: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 28023 1726853610.87612: stdout chunk (state=3): >>>import 'io' # <<< 28023 1726853610.87684: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 28023 1726853610.87862: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # <<< 28023 1726853610.87913: stdout chunk (state=3): >>>import 'posixpath' # import 'os' # <<< 28023 1726853610.88069: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 28023 1726853610.88078: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 28023 1726853610.88253: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe2dfa0> import 'site' # <<< 28023 1726853610.88323: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28023 1726853610.88679: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 28023 1726853610.88734: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.88825: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 28023 1726853610.88926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe6bec0> <<< 28023 1726853610.88976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 28023 1726853610.88989: stdout chunk (state=3): >>>import '_operator' # <<< 28023 1726853610.89096: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe6bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 28023 1726853610.89134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.89220: stdout chunk (state=3): 
>>>import 'itertools' # <<< 28023 1726853610.89311: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fea3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 28023 1726853610.89335: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fea3ec0> import '_collections' # <<< 28023 1726853610.89429: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe83b60> import '_functools' # <<< 28023 1726853610.89499: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe812b0> <<< 28023 1726853610.89869: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe69070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fec37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fec23f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fec0bc0> <<< 28023 1726853610.89909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fef8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 28023 1726853610.90008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fef8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fef8bf0> <<< 28023 1726853610.90149: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fef8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe66e10> <<< 28023 1726853610.90196: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 28023 1726853610.90264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fef9670> <<< 28023 1726853610.90288: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fef9370> import 'importlib.machinery' # <<< 28023 1726853610.90391: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fefa540> <<< 28023 1726853610.90447: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 28023 1726853610.90450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 28023 1726853610.90676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888ff10740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f888ff11e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 28023 1726853610.90679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 28023 1726853610.90682: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 28023 1726853610.90685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 28023 1726853610.90879: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888ff12cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888ff132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888ff12210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 28023 1726853610.90882: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.90885: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888ff13d70> <<< 28023 1726853610.90887: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888ff134a0> <<< 28023 1726853610.91015: stdout chunk (state=3): >>>import 'shutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f888fefa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 28023 1726853610.91030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 28023 1726853610.91202: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fcdfc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fd087a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd08500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fd087d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 28023 1726853610.91315: stdout chunk 
(state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.91473: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fd09100> <<< 28023 1726853610.91660: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.91664: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853610.91674: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fd09af0> <<< 28023 1726853610.91686: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd089b0> <<< 28023 1726853610.91711: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fcdddf0> <<< 28023 1726853610.91735: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 28023 1726853610.91774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 28023 1726853610.91796: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 28023 1726853610.91824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 28023 1726853610.91837: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd0af00> <<< 28023 1726853610.91876: stdout chunk (state=3): >>>import 'weakref' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd09c40> <<< 28023 1726853610.91895: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fefac60> <<< 28023 1726853610.91935: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 28023 1726853610.92188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd2f230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 28023 1726853610.92204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.92224: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 28023 1726853610.92261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 28023 1726853610.92319: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd57620> <<< 28023 1726853610.92362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 28023 1726853610.92426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 28023 1726853610.92514: stdout chunk (state=3): >>>import 'ntpath' # <<< 28023 1726853610.92548: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 28023 1726853610.92560: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fdb8380> <<< 28023 1726853610.92598: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 28023 1726853610.92645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 28023 1726853610.92680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 28023 1726853610.92739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 28023 1726853610.92873: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fdbaae0> <<< 28023 1726853610.92990: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fdb84a0> <<< 28023 1726853610.93047: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd793a0> <<< 28023 1726853610.93074: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 28023 1726853610.93093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 28023 1726853610.93321: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fbc1430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd56420> import 'zipfile' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f888fd0be00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 28023 1726853610.93339: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f888fd56780> <<< 28023 1726853610.93668: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_mj0krpt2/ansible_stat_payload.zip' <<< 28023 1726853610.93673: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.93882: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.93913: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 28023 1726853610.93974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 28023 1726853610.94015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 28023 1726853610.94108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 28023 1726853610.94143: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 28023 1726853610.94150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 28023 1726853610.94184: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc171d0> <<< 28023 1726853610.94193: stdout chunk (state=3): >>>import '_typing' # <<< 28023 1726853610.94460: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fbf60c0> <<< 28023 1726853610.94484: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f888fbf5220> # zipimport: zlib available <<< 28023 1726853610.94537: stdout chunk (state=3): >>>import 'ansible' # <<< 28023 1726853610.94556: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.94590: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.94602: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.94635: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 28023 1726853610.94650: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.97015: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853610.98747: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 28023 1726853610.98837: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc15070> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853610.98842: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 28023 1726853610.98851: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 28023 1726853610.98889: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 28023 1726853610.98901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 28023 1726853610.98944: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 28023 
1726853610.98960: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fc3eb10> <<< 28023 1726853610.99020: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc3e8a0> <<< 28023 1726853610.99068: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc3e1b0> <<< 28023 1726853610.99109: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 28023 1726853610.99112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 28023 1726853610.99183: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc3e600> <<< 28023 1726853610.99186: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc17bf0> <<< 28023 1726853610.99295: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fc3f8c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fc3fb00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 28023 1726853610.99358: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 28023 1726853610.99404: stdout chunk (state=3): >>>import '_locale' # <<< 28023 1726853610.99456: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc3ff80> <<< 28023 1726853610.99497: stdout chunk (state=3): >>>import 'pwd' # <<< 28023 1726853610.99501: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 28023 1726853610.99550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 28023 1726853610.99601: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f515df0> <<< 28023 1726853610.99679: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f5179e0> <<< 28023 1726853610.99711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py<<< 28023 1726853610.99932: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f518350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f5194f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 28023 1726853610.99965: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 28023 1726853611.00012: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f51bfe0> <<< 28023 1726853611.00083: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853611.00142: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f520350> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f51a2a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 28023 1726853611.00249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 28023 1726853611.00276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 28023 1726853611.00310: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 28023 1726853611.00332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 28023 1726853611.00344: stdout 
chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f523ec0> <<< 28023 1726853611.00369: stdout chunk (state=3): >>>import '_tokenize' # <<< 28023 1726853611.00460: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f5229c0> <<< 28023 1726853611.00463: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f522720> <<< 28023 1726853611.00493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 28023 1726853611.00511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 28023 1726853611.00634: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f522c60> <<< 28023 1726853611.00688: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f51a7b0><<< 28023 1726853611.00691: stdout chunk (state=3): >>> <<< 28023 1726853611.00727: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853611.00897: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f56bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f56c140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f56dc40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f56da00> <<< 28023 1726853611.00909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 28023 1726853611.01099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 28023 1726853611.01150: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853611.01294: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f570200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f56e330> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 28023 1726853611.01306: stdout chunk 
(state=3): >>>import '_string' # <<< 28023 1726853611.01374: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f5739e0> <<< 28023 1726853611.01574: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f5703b0> <<< 28023 1726853611.01663: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853611.01667: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f574830> <<< 28023 1726853611.01714: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853611.01722: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f574a40> <<< 28023 1726853611.01800: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f574d40> <<< 28023 1726853611.01820: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f56c350> <<< 28023 1726853611.01844: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc 
matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 28023 1726853611.01860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 28023 1726853611.01895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 28023 1726853611.01923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 28023 1726853611.01963: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853611.02002: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f4004d0> <<< 28023 1726853611.02278: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f401820> <<< 28023 1726853611.02490: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f576c90> <<< 28023 1726853611.02493: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f522c00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f5768a0> # zipimport: 
zlib available # zipimport: zlib available <<< 28023 1726853611.02496: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 28023 1726853611.02498: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.02567: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.02715: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.02926: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 28023 1726853611.03005: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.03136: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.04062: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.04933: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 28023 1726853611.04973: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 28023 1726853611.04984: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 28023 1726853611.04993: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 28023 1726853611.05025: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 28023 1726853611.05079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853611.05159: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f4059a0> <<< 
28023 1726853611.05276: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 28023 1726853611.05292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 28023 1726853611.05314: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f406720> <<< 28023 1726853611.05340: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f400620> <<< 28023 1726853611.05404: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 28023 1726853611.05428: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.05453: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.05485: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 28023 1726853611.05581: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.05742: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.05976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 28023 1726853611.05995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 28023 1726853611.06009: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f406420> <<< 28023 1726853611.06037: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.06789: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.07507: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.07621: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.07733: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 
28023 1726853611.07991: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 28023 1726853611.08092: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 28023 1726853611.08129: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.08133: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.08143: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 28023 1726853611.08178: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.08231: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.08287: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 28023 1726853611.08300: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.08673: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.09029: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 28023 1726853611.09127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 28023 1726853611.09146: stdout chunk (state=3): >>>import '_ast' # <<< 28023 1726853611.09263: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f407950> <<< 28023 1726853611.09280: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.09396: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.09505: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 28023 1726853611.09538: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 28023 1726853611.09566: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.arg_spec' # <<< 28023 1726853611.09686: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853611.09700: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 28023 1726853611.09715: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.09891: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28023 1726853611.09927: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.10034: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 28023 1726853611.10119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853611.10245: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853611.10257: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 28023 1726853611.10261: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f4122d0> <<< 28023 1726853611.10324: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f40d280> <<< 28023 1726853611.10369: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 28023 1726853611.10376: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 28023 1726853611.10405: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.10507: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.10607: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 28023 1726853611.10647: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.10719: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 28023 1726853611.10775: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 28023 1726853611.10800: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 28023 1726853611.10833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 28023 1726853611.10920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 28023 1726853611.10951: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 28023 1726853611.10975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 28023 1726853611.11066: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f502a20> <<< 28023 1726853611.11128: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc7a6f0> <<< 28023 1726853611.11237: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f407260> <<< 28023 1726853611.11255: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f889001b980> <<< 28023 1726853611.11263: stdout chunk (state=3): >>># destroy 
ansible.module_utils.distro <<< 28023 1726853611.11268: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 28023 1726853611.11295: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.11329: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.11365: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 28023 1726853611.11489: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 28023 1726853611.11492: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.11709: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.12011: stdout chunk (state=3): >>># zipimport: zlib available <<< 28023 1726853611.12153: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 28023 1726853611.12198: stdout chunk (state=3): >>># destroy __main__ <<< 28023 1726853611.12705: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 28023 1726853611.12733: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc <<< 28023 1726853611.12745: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout <<< 28023 1726853611.12764: stdout chunk (state=3): >>># restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 28023 1726853611.12772: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # 
cleanup[2] removing _warnings # cleanup[2] removing _weakref <<< 28023 1726853611.12793: stdout chunk (state=3): >>># cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external<<< 28023 1726853611.13024: stdout chunk (state=3): >>> # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing 
importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing 
signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy 
ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 28023 1726853611.13305: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 28023 1726853611.13397: stdout chunk (state=3): >>># destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 28023 1726853611.13790: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # 
cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 28023 1726853611.13819: stdout chunk (state=3): >>># destroy sys.monitoring <<< 28023 1726853611.13824: stdout chunk (state=3): >>># destroy _socket <<< 28023 1726853611.13855: stdout chunk (state=3): >>># destroy _collections <<< 28023 1726853611.13883: stdout chunk (state=3): >>># destroy platform <<< 28023 1726853611.13906: stdout chunk (state=3): >>># destroy _uuid # 
destroy stat # destroy genericpath # destroy re._parser <<< 28023 1726853611.13911: stdout chunk (state=3): >>># destroy tokenize <<< 28023 1726853611.13941: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 28023 1726853611.13946: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 28023 1726853611.13990: stdout chunk (state=3): >>># destroy _typing <<< 28023 1726853611.14006: stdout chunk (state=3): >>># destroy _tokenize <<< 28023 1726853611.14035: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator <<< 28023 1726853611.14038: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves <<< 28023 1726853611.14052: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 28023 1726853611.14084: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib<<< 28023 1726853611.14087: stdout chunk (state=3): >>> <<< 28023 1726853611.14220: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs <<< 28023 1726853611.14266: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 28023 1726853611.14281: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 28023 1726853611.14319: stdout chunk (state=3): >>># destroy _random <<< 28023 1726853611.14344: stdout chunk (state=3): >>># destroy _weakref # destroy _hashlib <<< 28023 1726853611.14351: stdout chunk (state=3): >>># 
destroy _operator <<< 28023 1726853611.14493: stdout chunk (state=3): >>># destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 28023 1726853611.14877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853611.14930: stderr chunk (state=3): >>><<< 28023 1726853611.14933: stdout chunk (state=3): >>><<< 28023 1726853611.15015: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88900184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888ffe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f889001aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # 
import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe6bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe6bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fea3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f888fea3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe69070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fec37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fec23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fec0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fef8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fef8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fef8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fef8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fe66e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fef9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fef9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fefa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888ff10740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888ff11e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888ff12cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888ff132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888ff12210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f888ff13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888ff134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fefa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fcdfc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fd087a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd08500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fd087d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fd09100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fd09af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd089b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fcdddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd0af00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd09c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fefac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f888fd2f230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd57620> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fdb8380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fdbaae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fdb84a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd793a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f888fbc1430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd56420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fd0be00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f888fd56780> # zipimport: found 30 names in '/tmp/ansible_stat_payload_mj0krpt2/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc171d0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fbf60c0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fbf5220> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc15070> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fc3eb10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc3e8a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc3e1b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc3e600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc17bf0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fc3f8c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f888fc3fb00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc3ff80> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f515df0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f5179e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f518350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f5194f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f51bfe0> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f520350> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f51a2a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f523ec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f5229c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f522720> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f522c60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f51a7b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f56bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f56c140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f56dc40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f56da00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f570200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f56e330> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f5739e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f5703b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f574830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f574a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f574d40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f56c350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f4004d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f401820> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f576c90> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f522c00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f5768a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f4059a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f406720> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f400620> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f406420> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f407950> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f888f4122d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f40d280> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f502a20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888fc7a6f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f888f407260> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f889001b980> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. [WARNING]: Module invocation had junk after the JSON data: 28023 1726853611.15602: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'],
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853611.15605: _low_level_execute_command(): starting 28023 1726853611.15610: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853610.477357-28160-46241368556790/ > /dev/null 2>&1 && sleep 0' 28023 1726853611.15837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853611.15840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853611.15842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853611.15845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853611.15847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853611.15895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853611.15899: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853611.15969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853611.18536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853611.18562: stderr chunk (state=3): >>><<< 28023 1726853611.18565: stdout chunk (state=3): >>><<< 28023 1726853611.18580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853611.18586: handler run complete 28023 1726853611.18607: attempt loop complete, returning result 28023 1726853611.18610: _execute() done 28023 1726853611.18612: dumping result to json 28023 1726853611.18614: done dumping result, returning 28023 1726853611.18622: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree 
[02083763-bbaf-fdb6-dad7-0000000000d2] 28023 1726853611.18628: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000d2 28023 1726853611.18714: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000d2 28023 1726853611.18717: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 28023 1726853611.18772: no more pending results, returning what we have 28023 1726853611.18775: results queue empty 28023 1726853611.18776: checking for any_errors_fatal 28023 1726853611.18783: done checking for any_errors_fatal 28023 1726853611.18784: checking for max_fail_percentage 28023 1726853611.18785: done checking for max_fail_percentage 28023 1726853611.18786: checking to see if all hosts have failed and the running result is not ok 28023 1726853611.18787: done checking to see if all hosts have failed 28023 1726853611.18787: getting the remaining hosts for this loop 28023 1726853611.18789: done getting the remaining hosts for this loop 28023 1726853611.18792: getting the next task for host managed_node3 28023 1726853611.18798: done getting next task for host managed_node3 28023 1726853611.18800: ^ task is: TASK: Set flag to indicate system is ostree 28023 1726853611.18803: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853611.18806: getting variables 28023 1726853611.18807: in VariableManager get_vars() 28023 1726853611.18836: Calling all_inventory to load vars for managed_node3 28023 1726853611.18838: Calling groups_inventory to load vars for managed_node3 28023 1726853611.18841: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853611.18852: Calling all_plugins_play to load vars for managed_node3 28023 1726853611.18855: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853611.18857: Calling groups_plugins_play to load vars for managed_node3 28023 1726853611.19036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853611.19187: done with get_vars() 28023 1726853611.19199: done getting variables 28023 1726853611.19295: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:33:31 -0400 (0:00:00.791) 0:00:03.277 ****** 28023 1726853611.19323: entering _queue_task() for managed_node3/set_fact 28023 1726853611.19324: Creating lock for set_fact 28023 1726853611.19614: worker is 1 (out of 1 available) 28023 1726853611.19625: exiting _queue_task() for managed_node3/set_fact 28023 1726853611.19636: done queuing things up, now waiting for results queue to drain 28023 1726853611.19637: waiting for pending results... 
28023 1726853611.19981: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree
28023 1726853611.20019: in run() - task 02083763-bbaf-fdb6-dad7-0000000000d3
28023 1726853611.20052: variable 'ansible_search_path' from source: unknown
28023 1726853611.20056: variable 'ansible_search_path' from source: unknown
28023 1726853611.20105: calling self._execute()
28023 1726853611.20212: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853611.20216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853611.20230: variable 'omit' from source: magic vars
28023 1726853611.20651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
28023 1726853611.20825: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
28023 1726853611.20859: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
28023 1726853611.20885: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
28023 1726853611.20912: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
28023 1726853611.20972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
28023 1726853611.20991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
28023 1726853611.21009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853611.21028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
28023 1726853611.21117: Evaluated conditional (not __network_is_ostree is defined): True
28023 1726853611.21121: variable 'omit' from source: magic vars
28023 1726853611.21145: variable 'omit' from source: magic vars
28023 1726853611.21228: variable '__ostree_booted_stat' from source: set_fact
28023 1726853611.21266: variable 'omit' from source: magic vars
28023 1726853611.21287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28023 1726853611.21309: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28023 1726853611.21323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28023 1726853611.21335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28023 1726853611.21346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28023 1726853611.21366: variable 'inventory_hostname' from source: host vars for 'managed_node3'
28023 1726853611.21370: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853611.21374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853611.21439: Set connection var ansible_shell_type to sh
28023 1726853611.21446: Set connection var ansible_shell_executable to /bin/sh
28023 1726853611.21451: Set connection var ansible_module_compression to ZIP_DEFLATED
28023 1726853611.21462: Set connection var ansible_connection to ssh
28023 1726853611.21467: Set connection var ansible_pipelining to False
28023 1726853611.21474: Set connection var ansible_timeout to 10
28023 1726853611.21494: variable 'ansible_shell_executable' from source: unknown
28023 1726853611.21497: variable 'ansible_connection' from source: unknown
28023 1726853611.21500: variable 'ansible_module_compression' from source: unknown
28023 1726853611.21502: variable 'ansible_shell_type' from source: unknown
28023 1726853611.21504: variable 'ansible_shell_executable' from source: unknown
28023 1726853611.21506: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853611.21509: variable 'ansible_pipelining' from source: unknown
28023 1726853611.21512: variable 'ansible_timeout' from source: unknown
28023 1726853611.21516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853611.21595: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
28023 1726853611.21603: variable 'omit' from source: magic vars
28023 1726853611.21608: starting attempt loop
28023 1726853611.21610: running the handler
28023 1726853611.21619: handler run complete
28023 1726853611.21635: attempt loop complete, returning result
28023 1726853611.21638: _execute() done
28023 1726853611.21641: dumping result to json
28023 1726853611.21644: done dumping result, returning
28023 1726853611.21646: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [02083763-bbaf-fdb6-dad7-0000000000d3]
28023 1726853611.21648: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000d3
28023 1726853611.21720: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000d3
28023 1726853611.21723: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
28023 1726853611.21784: no more pending results, returning what we have
28023 1726853611.21787: results queue empty
28023 1726853611.21788: checking for any_errors_fatal
28023 1726853611.21793: done checking for any_errors_fatal
28023 1726853611.21794: checking for max_fail_percentage
28023 1726853611.21795: done checking for max_fail_percentage
28023 1726853611.21796: checking to see if all hosts have failed and the running result is not ok
28023 1726853611.21797: done checking to see if all hosts have failed
28023 1726853611.21798: getting the remaining hosts for this loop
28023 1726853611.21799: done getting the remaining hosts for this loop
28023 1726853611.21802: getting the next task for host managed_node3
28023 1726853611.21810: done getting next task for host managed_node3
28023 1726853611.21813: ^ task is: TASK: Fix CentOS6 Base repo
28023 1726853611.21815: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
28023 1726853611.21819: getting variables
28023 1726853611.21820: in VariableManager get_vars()
28023 1726853611.21850: Calling all_inventory to load vars for managed_node3
28023 1726853611.21853: Calling groups_inventory to load vars for managed_node3
28023 1726853611.21856: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853611.21866: Calling all_plugins_play to load vars for managed_node3
28023 1726853611.21868: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853611.21878: Calling groups_plugins_play to load vars for managed_node3
28023 1726853611.22049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853611.22250: done with get_vars()
28023 1726853611.22260: done getting variables
28023 1726853611.22368: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Friday 20 September 2024 13:33:31 -0400 (0:00:00.033) 0:00:03.310 ******
28023 1726853611.22698: entering _queue_task() for managed_node3/copy
28023 1726853611.22957: worker is 1 (out of 1 available)
28023 1726853611.22968: exiting _queue_task() for managed_node3/copy
28023 1726853611.22982: done queuing things up, now waiting for results queue to drain
28023 1726853611.22984: waiting for pending results...
28023 1726853611.23301: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo
28023 1726853611.23320: in run() - task 02083763-bbaf-fdb6-dad7-0000000000d5
28023 1726853611.23337: variable 'ansible_search_path' from source: unknown
28023 1726853611.23343: variable 'ansible_search_path' from source: unknown
28023 1726853611.23399: calling self._execute()
28023 1726853611.23507: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853611.23511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853611.23513: variable 'omit' from source: magic vars
28023 1726853611.23979: variable 'ansible_distribution' from source: facts
28023 1726853611.24007: Evaluated conditional (ansible_distribution == 'CentOS'): True
28023 1726853611.24140: variable 'ansible_distribution_major_version' from source: facts
28023 1726853611.24356: Evaluated conditional (ansible_distribution_major_version == '6'): False
28023 1726853611.24366: when evaluation is False, skipping this task
28023 1726853611.24369: _execute() done
28023 1726853611.24373: dumping result to json
28023 1726853611.24375: done dumping result, returning
28023 1726853611.24377: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [02083763-bbaf-fdb6-dad7-0000000000d5]
28023 1726853611.24379: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000d5
28023 1726853611.24448: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000d5
28023 1726853611.24451: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
28023 1726853611.24532: no more pending results, returning what we have
28023 1726853611.24535: results queue empty
28023 1726853611.24536: checking for any_errors_fatal
28023 1726853611.24542: done checking for any_errors_fatal
28023 1726853611.24542: checking for max_fail_percentage
28023 1726853611.24544: done checking for max_fail_percentage
28023 1726853611.24544: checking to see if all hosts have failed and the running result is not ok
28023 1726853611.24545: done checking to see if all hosts have failed
28023 1726853611.24546: getting the remaining hosts for this loop
28023 1726853611.24547: done getting the remaining hosts for this loop
28023 1726853611.24551: getting the next task for host managed_node3
28023 1726853611.24557: done getting next task for host managed_node3
28023 1726853611.24560: ^ task is: TASK: Include the task 'enable_epel.yml'
28023 1726853611.24563: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28023 1726853611.24567: getting variables
28023 1726853611.24569: in VariableManager get_vars()
28023 1726853611.24600: Calling all_inventory to load vars for managed_node3
28023 1726853611.24603: Calling groups_inventory to load vars for managed_node3
28023 1726853611.24606: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853611.24617: Calling all_plugins_play to load vars for managed_node3
28023 1726853611.24619: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853611.24622: Calling groups_plugins_play to load vars for managed_node3
28023 1726853611.25211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853611.25622: done with get_vars()
28023 1726853611.25634: done getting variables

TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Friday 20 September 2024 13:33:31 -0400 (0:00:00.030) 0:00:03.341 ******
28023 1726853611.25730: entering _queue_task() for managed_node3/include_tasks
28023 1726853611.26063: worker is 1 (out of 1 available)
28023 1726853611.26076: exiting _queue_task() for managed_node3/include_tasks
28023 1726853611.26087: done queuing things up, now waiting for results queue to drain
28023 1726853611.26089: waiting for pending results...
28023 1726853611.26689: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml'
28023 1726853611.26695: in run() - task 02083763-bbaf-fdb6-dad7-0000000000d6
28023 1726853611.26699: variable 'ansible_search_path' from source: unknown
28023 1726853611.26702: variable 'ansible_search_path' from source: unknown
28023 1726853611.26706: calling self._execute()
28023 1726853611.26927: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853611.27376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853611.27380: variable 'omit' from source: magic vars
28023 1726853611.27947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28023 1726853611.31236: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28023 1726853611.31326: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28023 1726853611.31372: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28023 1726853611.31411: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28023 1726853611.31443: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28023 1726853611.31530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28023 1726853611.31566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28023 1726853611.31598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853611.31640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28023 1726853611.31661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28023 1726853611.31786: variable '__network_is_ostree' from source: set_fact
28023 1726853611.31809: Evaluated conditional (not __network_is_ostree | d(false)): True
28023 1726853611.31820: _execute() done
28023 1726853611.31827: dumping result to json
28023 1726853611.31836: done dumping result, returning
28023 1726853611.31847: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-fdb6-dad7-0000000000d6]
28023 1726853611.31860: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000d6
28023 1726853611.32003: no more pending results, returning what we have
28023 1726853611.32008: in VariableManager get_vars()
28023 1726853611.32041: Calling all_inventory to load vars for managed_node3
28023 1726853611.32044: Calling groups_inventory to load vars for managed_node3
28023 1726853611.32047: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853611.32067: Calling all_plugins_play to load vars for managed_node3
28023 1726853611.32070: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853611.32166: Calling groups_plugins_play to load vars for managed_node3
28023 1726853611.32379: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000d6
28023 1726853611.32382: WORKER PROCESS EXITING
28023 1726853611.32403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853611.32639: done with get_vars()
28023 1726853611.32649: variable 'ansible_search_path' from source: unknown
28023 1726853611.32650: variable 'ansible_search_path' from source: unknown
28023 1726853611.32691: we have included files to process
28023 1726853611.32692: generating all_blocks data
28023 1726853611.32693: done generating all_blocks data
28023 1726853611.32710: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
28023 1726853611.32712: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
28023 1726853611.32715: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
28023 1726853611.33454: done processing included file
28023 1726853611.33457: iterating over new_blocks loaded from include file
28023 1726853611.33458: in VariableManager get_vars()
28023 1726853611.33482: done with get_vars()
28023 1726853611.33483: filtering new block on tags
28023 1726853611.33523: done filtering new block on tags
28023 1726853611.33527: in VariableManager get_vars()
28023 1726853611.33537: done with get_vars()
28023 1726853611.33539: filtering new block on tags
28023 1726853611.33551: done filtering new block on tags
28023 1726853611.33553: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3
28023 1726853611.33559: extending task lists for all hosts with included blocks
28023 1726853611.33667: done extending task lists
28023 1726853611.33668: done processing included files
28023 1726853611.33669: results queue empty
28023 1726853611.33670: checking for any_errors_fatal
28023 1726853611.33674: done checking for any_errors_fatal
28023 1726853611.33675: checking for max_fail_percentage
28023 1726853611.33676: done checking for max_fail_percentage
28023 1726853611.33677: checking to see if all hosts have failed and the running result is not ok
28023 1726853611.33678: done checking to see if all hosts have failed
28023 1726853611.33679: getting the remaining hosts for this loop
28023 1726853611.33689: done getting the remaining hosts for this loop
28023 1726853611.33691: getting the next task for host managed_node3
28023 1726853611.33695: done getting next task for host managed_node3
28023 1726853611.33698: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
28023 1726853611.33700: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28023 1726853611.33703: getting variables
28023 1726853611.33704: in VariableManager get_vars()
28023 1726853611.33712: Calling all_inventory to load vars for managed_node3
28023 1726853611.33714: Calling groups_inventory to load vars for managed_node3
28023 1726853611.33716: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853611.33720: Calling all_plugins_play to load vars for managed_node3
28023 1726853611.33728: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853611.33731: Calling groups_plugins_play to load vars for managed_node3
28023 1726853611.33884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853611.34097: done with get_vars()
28023 1726853611.34105: done getting variables
28023 1726853611.34181: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
28023 1726853611.34399: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 10] **********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Friday 20 September 2024 13:33:31 -0400 (0:00:00.087) 0:00:03.428 ******
28023 1726853611.34453: entering _queue_task() for managed_node3/command
28023 1726853611.34455: Creating lock for command
28023 1726853611.34768: worker is 1 (out of 1 available)
28023 1726853611.34788: exiting _queue_task() for managed_node3/command
28023 1726853611.34798: done queuing things up, now waiting for results queue to drain
28023 1726853611.34799: waiting for pending results...
28023 1726853611.35047: running TaskExecutor() for managed_node3/TASK: Create EPEL 10
28023 1726853611.35163: in run() - task 02083763-bbaf-fdb6-dad7-0000000000f0
28023 1726853611.35181: variable 'ansible_search_path' from source: unknown
28023 1726853611.35189: variable 'ansible_search_path' from source: unknown
28023 1726853611.35238: calling self._execute()
28023 1726853611.35308: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853611.35328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853611.35341: variable 'omit' from source: magic vars
28023 1726853611.35784: variable 'ansible_distribution' from source: facts
28023 1726853611.35802: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
28023 1726853611.35950: variable 'ansible_distribution_major_version' from source: facts
28023 1726853611.35983: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
28023 1726853611.35986: when evaluation is False, skipping this task
28023 1726853611.36077: _execute() done
28023 1726853611.36082: dumping result to json
28023 1726853611.36085: done dumping result, returning
28023 1726853611.36088: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [02083763-bbaf-fdb6-dad7-0000000000f0]
28023 1726853611.36091: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000f0
28023 1726853611.36162: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000f0
28023 1726853611.36165: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
28023 1726853611.36218: no more pending results, returning what we have
28023 1726853611.36222: results queue empty
28023 1726853611.36223: checking for any_errors_fatal
28023 1726853611.36224: done checking for any_errors_fatal
28023 1726853611.36225: checking for max_fail_percentage
28023 1726853611.36226: done checking for max_fail_percentage
28023 1726853611.36227: checking to see if all hosts have failed and the running result is not ok
28023 1726853611.36228: done checking to see if all hosts have failed
28023 1726853611.36229: getting the remaining hosts for this loop
28023 1726853611.36230: done getting the remaining hosts for this loop
28023 1726853611.36235: getting the next task for host managed_node3
28023 1726853611.36242: done getting next task for host managed_node3
28023 1726853611.36245: ^ task is: TASK: Install yum-utils package
28023 1726853611.36249: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28023 1726853611.36253: getting variables
28023 1726853611.36255: in VariableManager get_vars()
28023 1726853611.36393: Calling all_inventory to load vars for managed_node3
28023 1726853611.36396: Calling groups_inventory to load vars for managed_node3
28023 1726853611.36400: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853611.36414: Calling all_plugins_play to load vars for managed_node3
28023 1726853611.36417: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853611.36421: Calling groups_plugins_play to load vars for managed_node3
28023 1726853611.36794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853611.37032: done with get_vars()
28023 1726853611.37042: done getting variables
28023 1726853611.37149: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Friday 20 September 2024 13:33:31 -0400 (0:00:00.027) 0:00:03.455 ******
28023 1726853611.37180: entering _queue_task() for managed_node3/package
28023 1726853611.37182: Creating lock for package
28023 1726853611.37581: worker is 1 (out of 1 available)
28023 1726853611.37591: exiting _queue_task() for managed_node3/package
28023 1726853611.37601: done queuing things up, now waiting for results queue to drain
28023 1726853611.37603: waiting for pending results...
28023 1726853611.37886: running TaskExecutor() for managed_node3/TASK: Install yum-utils package
28023 1726853611.37898: in run() - task 02083763-bbaf-fdb6-dad7-0000000000f1
28023 1726853611.37901: variable 'ansible_search_path' from source: unknown
28023 1726853611.37904: variable 'ansible_search_path' from source: unknown
28023 1726853611.37948: calling self._execute()
28023 1726853611.38047: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853611.38051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853611.38057: variable 'omit' from source: magic vars
28023 1726853611.38405: variable 'ansible_distribution' from source: facts
28023 1726853611.38415: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
28023 1726853611.38502: variable 'ansible_distribution_major_version' from source: facts
28023 1726853611.38507: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
28023 1726853611.38510: when evaluation is False, skipping this task
28023 1726853611.38512: _execute() done
28023 1726853611.38515: dumping result to json
28023 1726853611.38520: done dumping result, returning
28023 1726853611.38526: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [02083763-bbaf-fdb6-dad7-0000000000f1]
28023 1726853611.38531: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000f1
28023 1726853611.38617: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000f1
28023 1726853611.38620: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
28023 1726853611.38666: no more pending results, returning what we have
28023 1726853611.38669: results queue empty
28023 1726853611.38670: checking for any_errors_fatal
28023 1726853611.38676: done checking for any_errors_fatal
28023 1726853611.38676: checking for max_fail_percentage
28023 1726853611.38678: done checking for max_fail_percentage
28023 1726853611.38678: checking to see if all hosts have failed and the running result is not ok
28023 1726853611.38679: done checking to see if all hosts have failed
28023 1726853611.38680: getting the remaining hosts for this loop
28023 1726853611.38682: done getting the remaining hosts for this loop
28023 1726853611.38685: getting the next task for host managed_node3
28023 1726853611.38691: done getting next task for host managed_node3
28023 1726853611.38692: ^ task is: TASK: Enable EPEL 7
28023 1726853611.38696: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28023 1726853611.38698: getting variables
28023 1726853611.38700: in VariableManager get_vars()
28023 1726853611.38723: Calling all_inventory to load vars for managed_node3
28023 1726853611.38725: Calling groups_inventory to load vars for managed_node3
28023 1726853611.38728: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853611.38736: Calling all_plugins_play to load vars for managed_node3
28023 1726853611.38738: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853611.38741: Calling groups_plugins_play to load vars for managed_node3
28023 1726853611.38863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853611.39008: done with get_vars()
28023 1726853611.39014: done getting variables
28023 1726853611.39052: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024 13:33:31 -0400 (0:00:00.018) 0:00:03.474 ******
28023 1726853611.39073: entering _queue_task() for managed_node3/command
28023 1726853611.39242: worker is 1 (out of 1 available)
28023 1726853611.39255: exiting _queue_task() for managed_node3/command
28023 1726853611.39264: done queuing things up, now waiting for results queue to drain
28023 1726853611.39265: waiting for pending results...
28023 1726853611.39417: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7
28023 1726853611.39489: in run() - task 02083763-bbaf-fdb6-dad7-0000000000f2
28023 1726853611.39496: variable 'ansible_search_path' from source: unknown
28023 1726853611.39505: variable 'ansible_search_path' from source: unknown
28023 1726853611.39532: calling self._execute()
28023 1726853611.39586: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853611.39589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853611.39599: variable 'omit' from source: magic vars
28023 1726853611.39849: variable 'ansible_distribution' from source: facts
28023 1726853611.39858: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
28023 1726853611.40076: variable 'ansible_distribution_major_version' from source: facts
28023 1726853611.40079: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
28023 1726853611.40082: when evaluation is False, skipping this task
28023 1726853611.40084: _execute() done
28023 1726853611.40086: dumping result to json
28023 1726853611.40089: done dumping result, returning
28023 1726853611.40091: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [02083763-bbaf-fdb6-dad7-0000000000f2]
28023 1726853611.40094: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000f2
28023 1726853611.40154: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000f2
28023 1726853611.40157: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
28023 1726853611.40203: no more pending results, returning what we have
28023 1726853611.40207: results queue empty
28023 1726853611.40208: checking for any_errors_fatal
28023 1726853611.40212: done checking for any_errors_fatal
28023 1726853611.40213: checking for max_fail_percentage
28023 1726853611.40215: done checking for max_fail_percentage
28023 1726853611.40215: checking to see if all hosts have failed and the running result is not ok
28023 1726853611.40216: done checking to see if all hosts have failed
28023 1726853611.40217: getting the remaining hosts for this loop
28023 1726853611.40219: done getting the remaining hosts for this loop
28023 1726853611.40222: getting the next task for host managed_node3
28023 1726853611.40230: done getting next task for host managed_node3
28023 1726853611.40233: ^ task is: TASK: Enable EPEL 8
28023 1726853611.40237: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28023 1726853611.40241: getting variables
28023 1726853611.40243: in VariableManager get_vars()
28023 1726853611.40273: Calling all_inventory to load vars for managed_node3
28023 1726853611.40276: Calling groups_inventory to load vars for managed_node3
28023 1726853611.40280: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853611.40294: Calling all_plugins_play to load vars for managed_node3
28023 1726853611.40297: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853611.40300: Calling groups_plugins_play to load vars for managed_node3
28023 1726853611.40631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853611.40834: done with get_vars()
28023 1726853611.40843: done getting variables
28023 1726853611.40901: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Friday 20 September 2024 13:33:31 -0400 (0:00:00.018) 0:00:03.493 ******
28023 1726853611.40926: entering _queue_task() for managed_node3/command
28023 1726853611.41146: worker is 1 (out of 1 available)
28023 1726853611.41158: exiting _queue_task() for managed_node3/command
28023 1726853611.41168: done queuing things up, now waiting for results queue to drain
28023 1726853611.41169: waiting for pending results...
28023 1726853611.41331: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 28023 1726853611.41397: in run() - task 02083763-bbaf-fdb6-dad7-0000000000f3 28023 1726853611.41408: variable 'ansible_search_path' from source: unknown 28023 1726853611.41412: variable 'ansible_search_path' from source: unknown 28023 1726853611.41438: calling self._execute() 28023 1726853611.41494: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853611.41502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853611.41507: variable 'omit' from source: magic vars 28023 1726853611.41753: variable 'ansible_distribution' from source: facts 28023 1726853611.41764: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28023 1726853611.41851: variable 'ansible_distribution_major_version' from source: facts 28023 1726853611.41858: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 28023 1726853611.41861: when evaluation is False, skipping this task 28023 1726853611.41863: _execute() done 28023 1726853611.41866: dumping result to json 28023 1726853611.41869: done dumping result, returning 28023 1726853611.41876: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [02083763-bbaf-fdb6-dad7-0000000000f3] 28023 1726853611.41881: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000f3 28023 1726853611.41964: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000f3 28023 1726853611.41967: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 28023 1726853611.42018: no more pending results, returning what we have 28023 1726853611.42021: results queue empty 28023 1726853611.42021: checking for any_errors_fatal 28023 1726853611.42025: done checking for any_errors_fatal 28023 1726853611.42026: checking for 
max_fail_percentage 28023 1726853611.42027: done checking for max_fail_percentage 28023 1726853611.42028: checking to see if all hosts have failed and the running result is not ok 28023 1726853611.42029: done checking to see if all hosts have failed 28023 1726853611.42030: getting the remaining hosts for this loop 28023 1726853611.42031: done getting the remaining hosts for this loop 28023 1726853611.42034: getting the next task for host managed_node3 28023 1726853611.42041: done getting next task for host managed_node3 28023 1726853611.42043: ^ task is: TASK: Enable EPEL 6 28023 1726853611.42046: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853611.42049: getting variables 28023 1726853611.42050: in VariableManager get_vars() 28023 1726853611.42085: Calling all_inventory to load vars for managed_node3 28023 1726853611.42087: Calling groups_inventory to load vars for managed_node3 28023 1726853611.42089: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853611.42096: Calling all_plugins_play to load vars for managed_node3 28023 1726853611.42098: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853611.42100: Calling groups_plugins_play to load vars for managed_node3 28023 1726853611.42362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853611.42482: done with get_vars() 28023 1726853611.42489: done getting variables 28023 1726853611.42528: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:33:31 -0400 (0:00:00.016) 0:00:03.509 ****** 28023 1726853611.42547: entering _queue_task() for managed_node3/copy 28023 1726853611.42723: worker is 1 (out of 1 available) 28023 1726853611.42735: exiting _queue_task() for managed_node3/copy 28023 1726853611.42744: done queuing things up, now waiting for results queue to drain 28023 1726853611.42745: waiting for pending results... 
28023 1726853611.42888: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 28023 1726853611.42989: in run() - task 02083763-bbaf-fdb6-dad7-0000000000f5 28023 1726853611.42993: variable 'ansible_search_path' from source: unknown 28023 1726853611.42996: variable 'ansible_search_path' from source: unknown 28023 1726853611.43012: calling self._execute() 28023 1726853611.43275: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853611.43278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853611.43281: variable 'omit' from source: magic vars 28023 1726853611.43459: variable 'ansible_distribution' from source: facts 28023 1726853611.43479: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28023 1726853611.43587: variable 'ansible_distribution_major_version' from source: facts 28023 1726853611.43599: Evaluated conditional (ansible_distribution_major_version == '6'): False 28023 1726853611.43607: when evaluation is False, skipping this task 28023 1726853611.43615: _execute() done 28023 1726853611.43624: dumping result to json 28023 1726853611.43632: done dumping result, returning 28023 1726853611.43642: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [02083763-bbaf-fdb6-dad7-0000000000f5] 28023 1726853611.43651: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000f5 28023 1726853611.43747: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000f5 28023 1726853611.43750: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 28023 1726853611.43794: no more pending results, returning what we have 28023 1726853611.43797: results queue empty 28023 1726853611.43797: checking for any_errors_fatal 28023 1726853611.43803: done checking for any_errors_fatal 28023 1726853611.43804: checking for max_fail_percentage 
28023 1726853611.43805: done checking for max_fail_percentage 28023 1726853611.43806: checking to see if all hosts have failed and the running result is not ok 28023 1726853611.43807: done checking to see if all hosts have failed 28023 1726853611.43808: getting the remaining hosts for this loop 28023 1726853611.43809: done getting the remaining hosts for this loop 28023 1726853611.43812: getting the next task for host managed_node3 28023 1726853611.43821: done getting next task for host managed_node3 28023 1726853611.43823: ^ task is: TASK: Set network provider to 'nm' 28023 1726853611.43826: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853611.43830: getting variables 28023 1726853611.43833: in VariableManager get_vars() 28023 1726853611.43861: Calling all_inventory to load vars for managed_node3 28023 1726853611.43863: Calling groups_inventory to load vars for managed_node3 28023 1726853611.43866: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853611.43880: Calling all_plugins_play to load vars for managed_node3 28023 1726853611.43882: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853611.43885: Calling groups_plugins_play to load vars for managed_node3 28023 1726853611.44078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853611.44307: done with get_vars() 28023 1726853611.44316: done getting variables 28023 1726853611.44384: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:13 Friday 20 September 2024 13:33:31 -0400 (0:00:00.018) 0:00:03.527 ****** 28023 1726853611.44409: entering _queue_task() for managed_node3/set_fact 28023 1726853611.44679: worker is 1 (out of 1 available) 28023 1726853611.44697: exiting _queue_task() for managed_node3/set_fact 28023 1726853611.44707: done queuing things up, now waiting for results queue to drain 28023 1726853611.44708: waiting for pending results... 28023 1726853611.44887: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 28023 1726853611.44935: in run() - task 02083763-bbaf-fdb6-dad7-000000000007 28023 1726853611.44945: variable 'ansible_search_path' from source: unknown 28023 1726853611.44975: calling self._execute() 28023 1726853611.45031: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853611.45038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853611.45048: variable 'omit' from source: magic vars 28023 1726853611.45122: variable 'omit' from source: magic vars 28023 1726853611.45143: variable 'omit' from source: magic vars 28023 1726853611.45172: variable 'omit' from source: magic vars 28023 1726853611.45204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853611.45232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853611.45249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853611.45263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853611.45276: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853611.45298: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853611.45301: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853611.45304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853611.45373: Set connection var ansible_shell_type to sh 28023 1726853611.45382: Set connection var ansible_shell_executable to /bin/sh 28023 1726853611.45385: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853611.45390: Set connection var ansible_connection to ssh 28023 1726853611.45395: Set connection var ansible_pipelining to False 28023 1726853611.45399: Set connection var ansible_timeout to 10 28023 1726853611.45419: variable 'ansible_shell_executable' from source: unknown 28023 1726853611.45421: variable 'ansible_connection' from source: unknown 28023 1726853611.45424: variable 'ansible_module_compression' from source: unknown 28023 1726853611.45426: variable 'ansible_shell_type' from source: unknown 28023 1726853611.45428: variable 'ansible_shell_executable' from source: unknown 28023 1726853611.45432: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853611.45436: variable 'ansible_pipelining' from source: unknown 28023 1726853611.45438: variable 'ansible_timeout' from source: unknown 28023 1726853611.45446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853611.45544: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853611.45558: variable 'omit' from source: magic vars 28023 1726853611.45561: starting 
attempt loop 28023 1726853611.45563: running the handler 28023 1726853611.45570: handler run complete 28023 1726853611.45579: attempt loop complete, returning result 28023 1726853611.45583: _execute() done 28023 1726853611.45586: dumping result to json 28023 1726853611.45588: done dumping result, returning 28023 1726853611.45599: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [02083763-bbaf-fdb6-dad7-000000000007] 28023 1726853611.45602: sending task result for task 02083763-bbaf-fdb6-dad7-000000000007 28023 1726853611.45677: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000007 28023 1726853611.45680: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 28023 1726853611.45744: no more pending results, returning what we have 28023 1726853611.45747: results queue empty 28023 1726853611.45748: checking for any_errors_fatal 28023 1726853611.45753: done checking for any_errors_fatal 28023 1726853611.45756: checking for max_fail_percentage 28023 1726853611.45757: done checking for max_fail_percentage 28023 1726853611.45758: checking to see if all hosts have failed and the running result is not ok 28023 1726853611.45759: done checking to see if all hosts have failed 28023 1726853611.45760: getting the remaining hosts for this loop 28023 1726853611.45761: done getting the remaining hosts for this loop 28023 1726853611.45763: getting the next task for host managed_node3 28023 1726853611.45769: done getting next task for host managed_node3 28023 1726853611.45772: ^ task is: TASK: meta (flush_handlers) 28023 1726853611.45774: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853611.45778: getting variables 28023 1726853611.45779: in VariableManager get_vars() 28023 1726853611.45802: Calling all_inventory to load vars for managed_node3 28023 1726853611.45805: Calling groups_inventory to load vars for managed_node3 28023 1726853611.45807: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853611.45816: Calling all_plugins_play to load vars for managed_node3 28023 1726853611.45818: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853611.45820: Calling groups_plugins_play to load vars for managed_node3 28023 1726853611.45968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853611.46087: done with get_vars() 28023 1726853611.46093: done getting variables 28023 1726853611.46134: in VariableManager get_vars() 28023 1726853611.46139: Calling all_inventory to load vars for managed_node3 28023 1726853611.46141: Calling groups_inventory to load vars for managed_node3 28023 1726853611.46142: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853611.46145: Calling all_plugins_play to load vars for managed_node3 28023 1726853611.46146: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853611.46148: Calling groups_plugins_play to load vars for managed_node3 28023 1726853611.46260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853611.46459: done with get_vars() 28023 1726853611.46472: done queuing things up, now waiting for results queue to drain 28023 1726853611.46474: results queue empty 28023 1726853611.46474: checking for any_errors_fatal 28023 1726853611.46476: done checking for any_errors_fatal 28023 1726853611.46477: checking for max_fail_percentage 28023 1726853611.46478: done checking for max_fail_percentage 28023 1726853611.46478: checking to see if all hosts have failed and the running result is not 
ok 28023 1726853611.46479: done checking to see if all hosts have failed 28023 1726853611.46480: getting the remaining hosts for this loop 28023 1726853611.46480: done getting the remaining hosts for this loop 28023 1726853611.46482: getting the next task for host managed_node3 28023 1726853611.46486: done getting next task for host managed_node3 28023 1726853611.46487: ^ task is: TASK: meta (flush_handlers) 28023 1726853611.46488: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853611.46495: getting variables 28023 1726853611.46495: in VariableManager get_vars() 28023 1726853611.46502: Calling all_inventory to load vars for managed_node3 28023 1726853611.46504: Calling groups_inventory to load vars for managed_node3 28023 1726853611.46506: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853611.46510: Calling all_plugins_play to load vars for managed_node3 28023 1726853611.46512: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853611.46514: Calling groups_plugins_play to load vars for managed_node3 28023 1726853611.46659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853611.46884: done with get_vars() 28023 1726853611.46895: done getting variables 28023 1726853611.46937: in VariableManager get_vars() 28023 1726853611.46946: Calling all_inventory to load vars for managed_node3 28023 1726853611.46948: Calling groups_inventory to load vars for managed_node3 28023 1726853611.46949: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853611.46952: Calling all_plugins_play to load vars for managed_node3 28023 1726853611.46955: Calling groups_plugins_inventory to load vars for 
managed_node3 28023 1726853611.46957: Calling groups_plugins_play to load vars for managed_node3 28023 1726853611.47065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853611.47239: done with get_vars() 28023 1726853611.47250: done queuing things up, now waiting for results queue to drain 28023 1726853611.47251: results queue empty 28023 1726853611.47252: checking for any_errors_fatal 28023 1726853611.47253: done checking for any_errors_fatal 28023 1726853611.47257: checking for max_fail_percentage 28023 1726853611.47258: done checking for max_fail_percentage 28023 1726853611.47258: checking to see if all hosts have failed and the running result is not ok 28023 1726853611.47259: done checking to see if all hosts have failed 28023 1726853611.47260: getting the remaining hosts for this loop 28023 1726853611.47260: done getting the remaining hosts for this loop 28023 1726853611.47263: getting the next task for host managed_node3 28023 1726853611.47265: done getting next task for host managed_node3 28023 1726853611.47266: ^ task is: None 28023 1726853611.47267: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853611.47268: done queuing things up, now waiting for results queue to drain 28023 1726853611.47269: results queue empty 28023 1726853611.47270: checking for any_errors_fatal 28023 1726853611.47272: done checking for any_errors_fatal 28023 1726853611.47273: checking for max_fail_percentage 28023 1726853611.47274: done checking for max_fail_percentage 28023 1726853611.47275: checking to see if all hosts have failed and the running result is not ok 28023 1726853611.47275: done checking to see if all hosts have failed 28023 1726853611.47277: getting the next task for host managed_node3 28023 1726853611.47279: done getting next task for host managed_node3 28023 1726853611.47280: ^ task is: None 28023 1726853611.47282: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853611.47321: in VariableManager get_vars() 28023 1726853611.47343: done with get_vars() 28023 1726853611.47351: in VariableManager get_vars() 28023 1726853611.47369: done with get_vars() 28023 1726853611.47376: variable 'omit' from source: magic vars 28023 1726853611.47405: in VariableManager get_vars() 28023 1726853611.47422: done with get_vars() 28023 1726853611.47445: variable 'omit' from source: magic vars PLAY [Test output device of routes] ******************************************** 28023 1726853611.47913: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28023 1726853611.47938: getting the remaining hosts for this loop 28023 1726853611.47939: done getting the remaining hosts for this loop 28023 1726853611.47942: getting the next task for host managed_node3 28023 1726853611.47944: done getting next task for host managed_node3 28023 1726853611.47946: ^ task is: TASK: Gathering Facts 28023 1726853611.47947: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853611.47949: getting variables 28023 1726853611.47950: in VariableManager get_vars() 28023 1726853611.47964: Calling all_inventory to load vars for managed_node3 28023 1726853611.47967: Calling groups_inventory to load vars for managed_node3 28023 1726853611.47968: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853611.47976: Calling all_plugins_play to load vars for managed_node3 28023 1726853611.47990: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853611.47994: Calling groups_plugins_play to load vars for managed_node3 28023 1726853611.48134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853611.48282: done with get_vars() 28023 1726853611.48287: done getting variables 28023 1726853611.48312: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3 Friday 20 September 2024 13:33:31 -0400 (0:00:00.039) 0:00:03.567 ****** 28023 1726853611.48334: entering _queue_task() for managed_node3/gather_facts 28023 1726853611.48532: worker is 1 (out of 1 available) 28023 1726853611.48545: exiting _queue_task() for managed_node3/gather_facts 28023 1726853611.48561: done queuing things up, now waiting for results queue to drain 28023 1726853611.48562: waiting for pending results... 
28023 1726853611.48722: running TaskExecutor() for managed_node3/TASK: Gathering Facts 28023 1726853611.48776: in run() - task 02083763-bbaf-fdb6-dad7-00000000011b 28023 1726853611.48792: variable 'ansible_search_path' from source: unknown 28023 1726853611.48823: calling self._execute() 28023 1726853611.48885: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853611.48889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853611.48897: variable 'omit' from source: magic vars 28023 1726853611.49156: variable 'ansible_distribution_major_version' from source: facts 28023 1726853611.49169: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853611.49174: variable 'omit' from source: magic vars 28023 1726853611.49193: variable 'omit' from source: magic vars 28023 1726853611.49216: variable 'omit' from source: magic vars 28023 1726853611.49247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853611.49279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853611.49295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853611.49307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853611.49317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853611.49340: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853611.49343: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853611.49345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853611.49414: Set connection var ansible_shell_type to sh 28023 1726853611.49420: Set connection 
var ansible_shell_executable to /bin/sh 28023 1726853611.49425: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853611.49431: Set connection var ansible_connection to ssh 28023 1726853611.49438: Set connection var ansible_pipelining to False 28023 1726853611.49441: Set connection var ansible_timeout to 10 28023 1726853611.49462: variable 'ansible_shell_executable' from source: unknown 28023 1726853611.49465: variable 'ansible_connection' from source: unknown 28023 1726853611.49469: variable 'ansible_module_compression' from source: unknown 28023 1726853611.49473: variable 'ansible_shell_type' from source: unknown 28023 1726853611.49476: variable 'ansible_shell_executable' from source: unknown 28023 1726853611.49478: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853611.49481: variable 'ansible_pipelining' from source: unknown 28023 1726853611.49484: variable 'ansible_timeout' from source: unknown 28023 1726853611.49486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853611.49616: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853611.49624: variable 'omit' from source: magic vars 28023 1726853611.49629: starting attempt loop 28023 1726853611.49631: running the handler 28023 1726853611.49644: variable 'ansible_facts' from source: unknown 28023 1726853611.49663: _low_level_execute_command(): starting 28023 1726853611.49670: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853611.50353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853611.50414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853611.50418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853611.50497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853611.52902: stdout chunk (state=3): >>>/root <<< 28023 1726853611.53064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853611.53091: stderr chunk (state=3): >>><<< 28023 1726853611.53097: stdout chunk (state=3): >>><<< 28023 1726853611.53115: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853611.53127: _low_level_execute_command(): starting 28023 1726853611.53132: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082 `" && echo ansible-tmp-1726853611.5311549-28209-61650815491082="` echo /root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082 `" ) && sleep 0' 28023 1726853611.53566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853611.53570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853611.53574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28023 1726853611.53576: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853611.53589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853611.53628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853611.53631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853611.53701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853611.56545: stdout chunk (state=3): >>>ansible-tmp-1726853611.5311549-28209-61650815491082=/root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082 <<< 28023 1726853611.56716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853611.56743: stderr chunk (state=3): >>><<< 28023 1726853611.56747: stdout chunk (state=3): >>><<< 28023 1726853611.56976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853611.5311549-28209-61650815491082=/root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853611.56980: variable 'ansible_module_compression' from source: unknown 28023 1726853611.56982: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28023 1726853611.56985: variable 'ansible_facts' from source: unknown 28023 1726853611.57174: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082/AnsiballZ_setup.py 28023 1726853611.57476: Sending initial data 28023 1726853611.57479: Sent initial data (153 bytes) 28023 1726853611.58055: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853611.58194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853611.58264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853611.60558: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853611.60640: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853611.60733: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpvlh76tg6 /root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082/AnsiballZ_setup.py <<< 28023 1726853611.60736: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082/AnsiballZ_setup.py" <<< 28023 1726853611.60794: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpvlh76tg6" to remote "/root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082/AnsiballZ_setup.py" <<< 28023 1726853611.62778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853611.62781: stdout chunk (state=3): >>><<< 28023 1726853611.62784: stderr chunk (state=3): >>><<< 28023 1726853611.62786: done transferring module to remote 28023 1726853611.62788: _low_level_execute_command(): starting 28023 1726853611.62790: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082/ /root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082/AnsiballZ_setup.py && sleep 0' 28023 1726853611.63525: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853611.63561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853611.63653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853611.66385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853611.66389: stdout chunk (state=3): >>><<< 28023 1726853611.66392: stderr chunk (state=3): >>><<< 28023 1726853611.66394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28023 1726853611.66401: _low_level_execute_command(): starting 28023 1726853611.66404: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082/AnsiballZ_setup.py && sleep 0' 28023 1726853611.67092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853611.67122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853611.67146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853611.67164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853611.67269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28023 1726853612.45474: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, 
"ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.58251953125, "5m": 0.50146484375, "15m": 0.2939453125}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_lsb": {}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "33", "second": "32", "epoch": "1726853612", "epoch_int": "1726853612", "date": "2024-09-20", "time": "13:33:32", "iso8601_micro": "2024-09-20T17:33:32.076779Z", "iso8601": "2024-09-20T17:33:32Z", "iso8601_basic": "20240920T133332076779", "iso8601_basic_short": "20240920T133332", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3277, "used": 254}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 756, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261802799104, "block_size": 4096, "block_total": 65519099, "block_available": 63916699, "block_used": 1602400, "inode_total": 131070960, "inode_available": 131029146, "inode_used": 41814, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_interfaces": ["lo", 
"rpltstbr", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": 
"off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", 
"macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28023 1726853612.47684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853612.47710: stderr chunk (state=3): >>><<< 28023 1726853612.47715: stdout chunk (state=3): >>><<< 28023 1726853612.47834: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.58251953125, "5m": 0.50146484375, "15m": 0.2939453125}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_lsb": {}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "33", "second": "32", "epoch": "1726853612", "epoch_int": "1726853612", "date": "2024-09-20", "time": 
"13:33:32", "iso8601_micro": "2024-09-20T17:33:32.076779Z", "iso8601": "2024-09-20T17:33:32Z", "iso8601_basic": "20240920T133332076779", "iso8601_basic_short": "20240920T133332", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, 
"ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3277, "used": 254}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 756, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261802799104, "block_size": 4096, "block_total": 65519099, "block_available": 63916699, "block_used": 1602400, "inode_total": 131070960, "inode_available": 131029146, "inode_used": 41814, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", 
"prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", 
"tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853612.48648: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853612.48777: _low_level_execute_command(): starting 28023 1726853612.48780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853611.5311549-28209-61650815491082/ > /dev/null 2>&1 && sleep 0' 28023 1726853612.50101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 28023 1726853612.50369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853612.50400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853612.50501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853612.52728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853612.52731: stdout chunk (state=3): >>><<< 28023 1726853612.52734: stderr chunk (state=3): >>><<< 28023 1726853612.52781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853612.53084: handler run complete 28023 1726853612.53146: variable 'ansible_facts' from source: unknown 28023 1726853612.53300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853612.53835: variable 'ansible_facts' from source: unknown 28023 1726853612.53986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853612.54286: attempt loop complete, returning result 28023 1726853612.54289: _execute() done 28023 1726853612.54291: dumping result to json 28023 1726853612.54345: done dumping result, returning 28023 1726853612.54361: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-fdb6-dad7-00000000011b] 28023 1726853612.54373: sending task result for task 02083763-bbaf-fdb6-dad7-00000000011b 28023 1726853612.55167: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000011b 28023 1726853612.55173: WORKER PROCESS EXITING ok: [managed_node3] 28023 1726853612.55562: no more pending results, returning what we have 28023 1726853612.55565: results queue empty 28023 1726853612.55566: checking for any_errors_fatal 28023 1726853612.55567: done checking for any_errors_fatal 28023 
1726853612.55568: checking for max_fail_percentage 28023 1726853612.55570: done checking for max_fail_percentage 28023 1726853612.55572: checking to see if all hosts have failed and the running result is not ok 28023 1726853612.55573: done checking to see if all hosts have failed 28023 1726853612.55574: getting the remaining hosts for this loop 28023 1726853612.55575: done getting the remaining hosts for this loop 28023 1726853612.55583: getting the next task for host managed_node3 28023 1726853612.55619: done getting next task for host managed_node3 28023 1726853612.55621: ^ task is: TASK: meta (flush_handlers) 28023 1726853612.55623: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853612.55627: getting variables 28023 1726853612.55629: in VariableManager get_vars() 28023 1726853612.55664: Calling all_inventory to load vars for managed_node3 28023 1726853612.55820: Calling groups_inventory to load vars for managed_node3 28023 1726853612.55825: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853612.55836: Calling all_plugins_play to load vars for managed_node3 28023 1726853612.55839: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853612.55842: Calling groups_plugins_play to load vars for managed_node3 28023 1726853612.56185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853612.56670: done with get_vars() 28023 1726853612.56757: done getting variables 28023 1726853612.56861: in VariableManager get_vars() 28023 1726853612.56876: Calling all_inventory to load vars for managed_node3 28023 1726853612.56878: Calling groups_inventory to load vars for managed_node3 28023 1726853612.56880: Calling 
all_plugins_inventory to load vars for managed_node3 28023 1726853612.56884: Calling all_plugins_play to load vars for managed_node3 28023 1726853612.56886: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853612.56888: Calling groups_plugins_play to load vars for managed_node3 28023 1726853612.57187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853612.57537: done with get_vars() 28023 1726853612.57550: done queuing things up, now waiting for results queue to drain 28023 1726853612.57551: results queue empty 28023 1726853612.57552: checking for any_errors_fatal 28023 1726853612.57674: done checking for any_errors_fatal 28023 1726853612.57675: checking for max_fail_percentage 28023 1726853612.57676: done checking for max_fail_percentage 28023 1726853612.57677: checking to see if all hosts have failed and the running result is not ok 28023 1726853612.57678: done checking to see if all hosts have failed 28023 1726853612.57682: getting the remaining hosts for this loop 28023 1726853612.57683: done getting the remaining hosts for this loop 28023 1726853612.57686: getting the next task for host managed_node3 28023 1726853612.57690: done getting next task for host managed_node3 28023 1726853612.57692: ^ task is: TASK: Set type and interface0 28023 1726853612.57694: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853612.57696: getting variables 28023 1726853612.57697: in VariableManager get_vars() 28023 1726853612.57718: Calling all_inventory to load vars for managed_node3 28023 1726853612.57720: Calling groups_inventory to load vars for managed_node3 28023 1726853612.57722: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853612.57727: Calling all_plugins_play to load vars for managed_node3 28023 1726853612.57729: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853612.57731: Calling groups_plugins_play to load vars for managed_node3 28023 1726853612.58002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853612.58845: done with get_vars() 28023 1726853612.58858: done getting variables 28023 1726853612.58903: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set type and interface0] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:11 Friday 20 September 2024 13:33:32 -0400 (0:00:01.105) 0:00:04.673 ****** 28023 1726853612.58930: entering _queue_task() for managed_node3/set_fact 28023 1726853612.60033: worker is 1 (out of 1 available) 28023 1726853612.60045: exiting _queue_task() for managed_node3/set_fact 28023 1726853612.60061: done queuing things up, now waiting for results queue to drain 28023 1726853612.60062: waiting for pending results... 
28023 1726853612.60356: running TaskExecutor() for managed_node3/TASK: Set type and interface0 28023 1726853612.60451: in run() - task 02083763-bbaf-fdb6-dad7-00000000000b 28023 1726853612.60485: variable 'ansible_search_path' from source: unknown 28023 1726853612.60527: calling self._execute() 28023 1726853612.60619: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853612.60632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853612.60645: variable 'omit' from source: magic vars 28023 1726853612.60998: variable 'ansible_distribution_major_version' from source: facts 28023 1726853612.61018: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853612.61076: variable 'omit' from source: magic vars 28023 1726853612.61079: variable 'omit' from source: magic vars 28023 1726853612.61094: variable 'type' from source: play vars 28023 1726853612.61175: variable 'type' from source: play vars 28023 1726853612.61201: variable 'interface0' from source: play vars 28023 1726853612.61261: variable 'interface0' from source: play vars 28023 1726853612.61283: variable 'omit' from source: magic vars 28023 1726853612.61327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853612.61374: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853612.61450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853612.61456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853612.61459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853612.61497: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853612.61504: 
variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853612.61512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853612.61776: Set connection var ansible_shell_type to sh 28023 1726853612.61784: Set connection var ansible_shell_executable to /bin/sh 28023 1726853612.61786: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853612.61788: Set connection var ansible_connection to ssh 28023 1726853612.61790: Set connection var ansible_pipelining to False 28023 1726853612.61792: Set connection var ansible_timeout to 10 28023 1726853612.61793: variable 'ansible_shell_executable' from source: unknown 28023 1726853612.61795: variable 'ansible_connection' from source: unknown 28023 1726853612.61976: variable 'ansible_module_compression' from source: unknown 28023 1726853612.61979: variable 'ansible_shell_type' from source: unknown 28023 1726853612.61981: variable 'ansible_shell_executable' from source: unknown 28023 1726853612.61983: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853612.61985: variable 'ansible_pipelining' from source: unknown 28023 1726853612.61987: variable 'ansible_timeout' from source: unknown 28023 1726853612.61989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853612.62146: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853612.62277: variable 'omit' from source: magic vars 28023 1726853612.62280: starting attempt loop 28023 1726853612.62282: running the handler 28023 1726853612.62284: handler run complete 28023 1726853612.62285: attempt loop complete, returning result 28023 1726853612.62287: _execute() done 28023 1726853612.62289: 
dumping result to json 28023 1726853612.62290: done dumping result, returning 28023 1726853612.62293: done running TaskExecutor() for managed_node3/TASK: Set type and interface0 [02083763-bbaf-fdb6-dad7-00000000000b] 28023 1726853612.62328: sending task result for task 02083763-bbaf-fdb6-dad7-00000000000b ok: [managed_node3] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 28023 1726853612.62485: no more pending results, returning what we have 28023 1726853612.62488: results queue empty 28023 1726853612.62489: checking for any_errors_fatal 28023 1726853612.62491: done checking for any_errors_fatal 28023 1726853612.62492: checking for max_fail_percentage 28023 1726853612.62494: done checking for max_fail_percentage 28023 1726853612.62494: checking to see if all hosts have failed and the running result is not ok 28023 1726853612.62495: done checking to see if all hosts have failed 28023 1726853612.62496: getting the remaining hosts for this loop 28023 1726853612.62498: done getting the remaining hosts for this loop 28023 1726853612.62501: getting the next task for host managed_node3 28023 1726853612.62507: done getting next task for host managed_node3 28023 1726853612.62510: ^ task is: TASK: Show interfaces 28023 1726853612.62512: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853612.62516: getting variables 28023 1726853612.62517: in VariableManager get_vars() 28023 1726853612.62561: Calling all_inventory to load vars for managed_node3 28023 1726853612.62563: Calling groups_inventory to load vars for managed_node3 28023 1726853612.62566: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853612.62581: Calling all_plugins_play to load vars for managed_node3 28023 1726853612.62584: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853612.62587: Calling groups_plugins_play to load vars for managed_node3 28023 1726853612.63333: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000000b 28023 1726853612.63336: WORKER PROCESS EXITING 28023 1726853612.63359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853612.63554: done with get_vars() 28023 1726853612.63565: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:15 Friday 20 September 2024 13:33:32 -0400 (0:00:00.047) 0:00:04.720 ****** 28023 1726853612.63650: entering _queue_task() for managed_node3/include_tasks 28023 1726853612.64086: worker is 1 (out of 1 available) 28023 1726853612.64094: exiting _queue_task() for managed_node3/include_tasks 28023 1726853612.64106: done queuing things up, now waiting for results queue to drain 28023 1726853612.64107: waiting for pending results... 
28023 1726853612.64154: running TaskExecutor() for managed_node3/TASK: Show interfaces 28023 1726853612.64250: in run() - task 02083763-bbaf-fdb6-dad7-00000000000c 28023 1726853612.64268: variable 'ansible_search_path' from source: unknown 28023 1726853612.64308: calling self._execute() 28023 1726853612.64395: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853612.64407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853612.64419: variable 'omit' from source: magic vars 28023 1726853612.64768: variable 'ansible_distribution_major_version' from source: facts 28023 1726853612.64788: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853612.64798: _execute() done 28023 1726853612.64805: dumping result to json 28023 1726853612.64813: done dumping result, returning 28023 1726853612.64823: done running TaskExecutor() for managed_node3/TASK: Show interfaces [02083763-bbaf-fdb6-dad7-00000000000c] 28023 1726853612.64833: sending task result for task 02083763-bbaf-fdb6-dad7-00000000000c 28023 1726853612.64943: no more pending results, returning what we have 28023 1726853612.64948: in VariableManager get_vars() 28023 1726853612.64995: Calling all_inventory to load vars for managed_node3 28023 1726853612.64998: Calling groups_inventory to load vars for managed_node3 28023 1726853612.65001: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853612.65015: Calling all_plugins_play to load vars for managed_node3 28023 1726853612.65018: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853612.65021: Calling groups_plugins_play to load vars for managed_node3 28023 1726853612.65337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853612.65665: done with get_vars() 28023 1726853612.65674: variable 'ansible_search_path' from source: unknown 28023 1726853612.65688: done sending 
task result for task 02083763-bbaf-fdb6-dad7-00000000000c 28023 1726853612.65691: WORKER PROCESS EXITING 28023 1726853612.65698: we have included files to process 28023 1726853612.65699: generating all_blocks data 28023 1726853612.65700: done generating all_blocks data 28023 1726853612.65701: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853612.65701: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853612.65704: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853612.65849: in VariableManager get_vars() 28023 1726853612.65898: done with get_vars() 28023 1726853612.66029: done processing included file 28023 1726853612.66031: iterating over new_blocks loaded from include file 28023 1726853612.66033: in VariableManager get_vars() 28023 1726853612.66050: done with get_vars() 28023 1726853612.66052: filtering new block on tags 28023 1726853612.66068: done filtering new block on tags 28023 1726853612.66090: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 28023 1726853612.66097: extending task lists for all hosts with included blocks 28023 1726853612.66289: done extending task lists 28023 1726853612.66290: done processing included files 28023 1726853612.66291: results queue empty 28023 1726853612.66292: checking for any_errors_fatal 28023 1726853612.66295: done checking for any_errors_fatal 28023 1726853612.66296: checking for max_fail_percentage 28023 1726853612.66297: done checking for max_fail_percentage 28023 1726853612.66297: checking to see if all hosts have failed and the running result is not ok 28023 
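The sequence above ("we have included files to process … generating all_blocks data … extending task lists for all hosts with included blocks") is how a dynamic `include_tasks` is resolved at run time: the included file is parsed into new blocks, filtered on tags, and appended to the host's task list. A sketch of the task that triggers it, assuming a conventional relative path (the actual arguments are not shown in this log):

```yaml
# Hedged sketch of the "Show interfaces" task at
# tests_route_device.yml:15; the include path is an assumption.
- name: Show interfaces
  include_tasks: tasks/show_interfaces.yml
```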
1726853612.66298: done checking to see if all hosts have failed 28023 1726853612.66299: getting the remaining hosts for this loop 28023 1726853612.66300: done getting the remaining hosts for this loop 28023 1726853612.66303: getting the next task for host managed_node3 28023 1726853612.66306: done getting next task for host managed_node3 28023 1726853612.66309: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28023 1726853612.66311: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853612.66313: getting variables 28023 1726853612.66316: in VariableManager get_vars() 28023 1726853612.66330: Calling all_inventory to load vars for managed_node3 28023 1726853612.66333: Calling groups_inventory to load vars for managed_node3 28023 1726853612.66335: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853612.66339: Calling all_plugins_play to load vars for managed_node3 28023 1726853612.66342: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853612.66344: Calling groups_plugins_play to load vars for managed_node3 28023 1726853612.66598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853612.66818: done with get_vars() 28023 1726853612.66827: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:33:32 -0400 (0:00:00.032) 0:00:04.752 ****** 28023 1726853612.66902: entering _queue_task() for managed_node3/include_tasks 28023 1726853612.67393: worker is 1 (out of 1 available) 28023 1726853612.67403: exiting _queue_task() for managed_node3/include_tasks 28023 1726853612.67413: done queuing things up, now waiting for results queue to drain 28023 1726853612.67415: waiting for pending results... 
28023 1726853612.67591: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 28023 1726853612.67704: in run() - task 02083763-bbaf-fdb6-dad7-000000000135 28023 1726853612.67731: variable 'ansible_search_path' from source: unknown 28023 1726853612.67740: variable 'ansible_search_path' from source: unknown 28023 1726853612.67788: calling self._execute() 28023 1726853612.67882: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853612.67894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853612.67909: variable 'omit' from source: magic vars 28023 1726853612.68303: variable 'ansible_distribution_major_version' from source: facts 28023 1726853612.68324: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853612.68335: _execute() done 28023 1726853612.68343: dumping result to json 28023 1726853612.68350: done dumping result, returning 28023 1726853612.68362: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-fdb6-dad7-000000000135] 28023 1726853612.68373: sending task result for task 02083763-bbaf-fdb6-dad7-000000000135 28023 1726853612.68530: no more pending results, returning what we have 28023 1726853612.68536: in VariableManager get_vars() 28023 1726853612.68588: Calling all_inventory to load vars for managed_node3 28023 1726853612.68591: Calling groups_inventory to load vars for managed_node3 28023 1726853612.68594: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853612.68610: Calling all_plugins_play to load vars for managed_node3 28023 1726853612.68614: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853612.68617: Calling groups_plugins_play to load vars for managed_node3 28023 1726853612.68986: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000135 28023 1726853612.68992: WORKER PROCESS EXITING 28023 
1726853612.69018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853612.69239: done with get_vars() 28023 1726853612.69246: variable 'ansible_search_path' from source: unknown 28023 1726853612.69247: variable 'ansible_search_path' from source: unknown 28023 1726853612.69284: we have included files to process 28023 1726853612.69285: generating all_blocks data 28023 1726853612.69286: done generating all_blocks data 28023 1726853612.69288: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853612.69289: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853612.69291: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853612.69597: done processing included file 28023 1726853612.69600: iterating over new_blocks loaded from include file 28023 1726853612.69602: in VariableManager get_vars() 28023 1726853612.69619: done with get_vars() 28023 1726853612.69622: filtering new block on tags 28023 1726853612.69640: done filtering new block on tags 28023 1726853612.69643: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 28023 1726853612.69647: extending task lists for all hosts with included blocks 28023 1726853612.69746: done extending task lists 28023 1726853612.69747: done processing included files 28023 1726853612.69748: results queue empty 28023 1726853612.69749: checking for any_errors_fatal 28023 1726853612.69751: done checking for any_errors_fatal 28023 1726853612.69752: checking for max_fail_percentage 28023 1726853612.69753: done 
checking for max_fail_percentage 28023 1726853612.69754: checking to see if all hosts have failed and the running result is not ok 28023 1726853612.69755: done checking to see if all hosts have failed 28023 1726853612.69755: getting the remaining hosts for this loop 28023 1726853612.69757: done getting the remaining hosts for this loop 28023 1726853612.69759: getting the next task for host managed_node3 28023 1726853612.69763: done getting next task for host managed_node3 28023 1726853612.69765: ^ task is: TASK: Gather current interface info 28023 1726853612.69767: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853612.69770: getting variables 28023 1726853612.69772: in VariableManager get_vars() 28023 1726853612.69783: Calling all_inventory to load vars for managed_node3 28023 1726853612.69786: Calling groups_inventory to load vars for managed_node3 28023 1726853612.69787: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853612.69791: Calling all_plugins_play to load vars for managed_node3 28023 1726853612.69793: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853612.69795: Calling groups_plugins_play to load vars for managed_node3 28023 1726853612.69937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853612.70139: done with get_vars() 28023 1726853612.70148: done getting variables 28023 1726853612.70185: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:33:32 -0400 (0:00:00.033) 0:00:04.786 ****** 28023 1726853612.70212: entering _queue_task() for managed_node3/command 28023 1726853612.70462: worker is 1 (out of 1 available) 28023 1726853612.70577: exiting _queue_task() for managed_node3/command 28023 1726853612.70591: done queuing things up, now waiting for results queue to drain 28023 1726853612.70592: waiting for pending results... 
28023 1726853612.70746: running TaskExecutor() for managed_node3/TASK: Gather current interface info 28023 1726853612.70856: in run() - task 02083763-bbaf-fdb6-dad7-00000000014e 28023 1726853612.70878: variable 'ansible_search_path' from source: unknown 28023 1726853612.70888: variable 'ansible_search_path' from source: unknown 28023 1726853612.70934: calling self._execute() 28023 1726853612.71017: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853612.71028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853612.71047: variable 'omit' from source: magic vars 28023 1726853612.71459: variable 'ansible_distribution_major_version' from source: facts 28023 1726853612.71483: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853612.71494: variable 'omit' from source: magic vars 28023 1726853612.71541: variable 'omit' from source: magic vars 28023 1726853612.71583: variable 'omit' from source: magic vars 28023 1726853612.71627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853612.71667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853612.71697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853612.71719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853612.71734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853612.71766: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853612.71777: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853612.71784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 
1726853612.71881: Set connection var ansible_shell_type to sh 28023 1726853612.71893: Set connection var ansible_shell_executable to /bin/sh 28023 1726853612.72076: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853612.72079: Set connection var ansible_connection to ssh 28023 1726853612.72082: Set connection var ansible_pipelining to False 28023 1726853612.72084: Set connection var ansible_timeout to 10 28023 1726853612.72086: variable 'ansible_shell_executable' from source: unknown 28023 1726853612.72088: variable 'ansible_connection' from source: unknown 28023 1726853612.72090: variable 'ansible_module_compression' from source: unknown 28023 1726853612.72092: variable 'ansible_shell_type' from source: unknown 28023 1726853612.72094: variable 'ansible_shell_executable' from source: unknown 28023 1726853612.72096: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853612.72098: variable 'ansible_pipelining' from source: unknown 28023 1726853612.72100: variable 'ansible_timeout' from source: unknown 28023 1726853612.72102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853612.72134: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853612.72148: variable 'omit' from source: magic vars 28023 1726853612.72159: starting attempt loop 28023 1726853612.72165: running the handler 28023 1726853612.72188: _low_level_execute_command(): starting 28023 1726853612.72199: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853612.72994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
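The "Set connection var …" lines show the effective connection settings the worker resolved for managed_node3. Written as inventory or group variables, they would look like the following (illustrative only — most of these values are ansible-core defaults rather than anything set explicitly in this run):

```yaml
# Connection settings as resolved in the log above,
# expressed as inventory variables for illustration.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_timeout: 10
```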
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853612.73019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853612.73044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853612.73059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853612.73161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853612.74875: stdout chunk (state=3): >>>/root <<< 28023 1726853612.75077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853612.75422: stderr chunk (state=3): >>><<< 28023 1726853612.75425: stdout chunk (state=3): >>><<< 28023 1726853612.75428: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853612.75437: _low_level_execute_command(): starting 28023 1726853612.75440: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610 `" && echo ansible-tmp-1726853612.7530904-28268-277424341282610="` echo /root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610 `" ) && sleep 0' 28023 1726853612.76593: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853612.76690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853612.76705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 28023 1726853612.76723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853612.76766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853612.76921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853612.77005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853612.79038: stdout chunk (state=3): >>>ansible-tmp-1726853612.7530904-28268-277424341282610=/root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610 <<< 28023 1726853612.79176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853612.79234: stderr chunk (state=3): >>><<< 28023 1726853612.79250: stdout chunk (state=3): >>><<< 28023 1726853612.79287: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853612.7530904-28268-277424341282610=/root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853612.79322: variable 'ansible_module_compression' from source: unknown 28023 1726853612.79396: ANSIBALLZ: Using generic lock for ansible.legacy.command 28023 1726853612.79478: ANSIBALLZ: Acquiring lock 28023 1726853612.79482: ANSIBALLZ: Lock acquired: 139729396667488 28023 1726853612.79484: ANSIBALLZ: Creating module 28023 1726853612.97379: ANSIBALLZ: Writing module into payload 28023 1726853612.97383: ANSIBALLZ: Writing module 28023 1726853612.97385: ANSIBALLZ: Renaming module 28023 1726853612.97387: ANSIBALLZ: Done creating module 28023 1726853612.97389: variable 'ansible_facts' from source: unknown 28023 1726853612.97399: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610/AnsiballZ_command.py 28023 1726853612.97993: Sending initial data 28023 1726853612.98002: Sent initial data (156 bytes) 28023 1726853612.98381: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853612.98397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853612.98413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853612.98432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853612.98456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853612.98468: stderr chunk (state=3): >>>debug2: match not 
found <<< 28023 1726853612.98484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853612.98503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853612.98518: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853612.98525: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853612.98565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853612.98617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853612.98668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853612.98797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.00521: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: 
Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853613.00564: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853613.00647: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp7f9t2kfa /root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610/AnsiballZ_command.py <<< 28023 1726853613.00661: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610/AnsiballZ_command.py" <<< 28023 1726853613.00798: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp7f9t2kfa" to remote "/root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610/AnsiballZ_command.py" <<< 28023 1726853613.02190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.02199: stdout chunk (state=3): >>><<< 28023 1726853613.02208: stderr chunk (state=3): >>><<< 28023 1726853613.02385: done transferring module to remote 28023 1726853613.02401: _low_level_execute_command(): starting 28023 1726853613.02410: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610/ /root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610/AnsiballZ_command.py && sleep 0' 28023 1726853613.03889: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853613.04024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853613.04119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.06022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.06031: stdout chunk (state=3): >>><<< 28023 1726853613.06042: stderr chunk (state=3): >>><<< 28023 1726853613.06176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853613.06181: _low_level_execute_command(): starting 28023 1726853613.06184: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610/AnsiballZ_command.py && sleep 0' 28023 1726853613.07686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.23513: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:33:33.229459", 
"end": "2024-09-20 13:33:33.232934", "delta": "0:00:00.003475", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853613.25090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.25153: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. <<< 28023 1726853613.25165: stdout chunk (state=3): >>><<< 28023 1726853613.25180: stderr chunk (state=3): >>><<< 28023 1726853613.25205: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:33:33.229459", "end": "2024-09-20 13:33:33.232934", "delta": "0:00:00.003475", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853613.25324: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853613.25332: _low_level_execute_command(): starting 28023 1726853613.25342: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853612.7530904-28268-277424341282610/ > /dev/null 2>&1 && sleep 0' 28023 1726853613.26789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.26817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853613.26840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853613.26941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.28886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.28897: stdout chunk (state=3): >>><<< 28023 1726853613.28909: stderr chunk (state=3): >>><<< 28023 1726853613.29080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853613.29084: handler run complete 28023 1726853613.29087: Evaluated conditional (False): False 28023 1726853613.29089: attempt loop complete, returning result 28023 1726853613.29091: _execute() done 28023 1726853613.29093: dumping result to json 28023 1726853613.29094: done dumping result, returning 28023 1726853613.29096: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-fdb6-dad7-00000000014e] 28023 1726853613.29098: sending task result for task 02083763-bbaf-fdb6-dad7-00000000014e ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003475", "end": "2024-09-20 13:33:33.232934", "rc": 0, "start": "2024-09-20 13:33:33.229459" } STDOUT: bonding_masters eth0 lo rpltstbr 28023 1726853613.29459: no more pending results, returning what we have 28023 1726853613.29462: results queue empty 28023 1726853613.29463: checking for any_errors_fatal 28023 1726853613.29465: done checking for any_errors_fatal 28023 1726853613.29465: checking for max_fail_percentage 28023 1726853613.29467: done checking for max_fail_percentage 28023 1726853613.29468: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.29469: done checking to see if all hosts have failed 28023 1726853613.29469: getting the remaining hosts for this loop 28023 1726853613.29542: done getting the remaining hosts for this loop 28023 1726853613.29547: getting the next task for host managed_node3 28023 1726853613.29553: done getting next task for host managed_node3 28023 1726853613.29619: ^ task is: TASK: Set current_interfaces 28023 1726853613.29623: ^ state is: HOST 
STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853613.29631: getting variables 28023 1726853613.29633: in VariableManager get_vars() 28023 1726853613.29696: Calling all_inventory to load vars for managed_node3 28023 1726853613.29700: Calling groups_inventory to load vars for managed_node3 28023 1726853613.29702: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.29717: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.29720: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.29723: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.29925: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000014e 28023 1726853613.29929: WORKER PROCESS EXITING 28023 1726853613.29958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.30416: done with get_vars() 28023 1726853613.30455: done getting variables 28023 1726853613.30519: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:33:33 -0400 (0:00:00.603) 0:00:05.389 ****** 28023 1726853613.30554: entering _queue_task() for managed_node3/set_fact 28023 1726853613.30950: worker is 1 (out of 1 available) 28023 1726853613.30962: exiting _queue_task() for managed_node3/set_fact 28023 1726853613.31084: done queuing things up, now waiting for results queue to drain 28023 1726853613.31086: waiting for pending results... 28023 1726853613.31165: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 28023 1726853613.31283: in run() - task 02083763-bbaf-fdb6-dad7-00000000014f 28023 1726853613.31320: variable 'ansible_search_path' from source: unknown 28023 1726853613.31328: variable 'ansible_search_path' from source: unknown 28023 1726853613.31368: calling self._execute() 28023 1726853613.31519: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.31580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.31583: variable 'omit' from source: magic vars 28023 1726853613.31976: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.31979: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.31982: variable 'omit' from source: magic vars 28023 1726853613.31985: variable 'omit' from source: magic vars 28023 1726853613.32021: variable '_current_interfaces' from source: set_fact 28023 1726853613.32096: variable 'omit' from source: magic vars 28023 1726853613.32138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 
1726853613.32184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853613.32211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853613.32236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.32258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.32294: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853613.32302: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.32308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.32404: Set connection var ansible_shell_type to sh 28023 1726853613.32416: Set connection var ansible_shell_executable to /bin/sh 28023 1726853613.32425: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853613.32434: Set connection var ansible_connection to ssh 28023 1726853613.32443: Set connection var ansible_pipelining to False 28023 1726853613.32453: Set connection var ansible_timeout to 10 28023 1726853613.32487: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.32495: variable 'ansible_connection' from source: unknown 28023 1726853613.32501: variable 'ansible_module_compression' from source: unknown 28023 1726853613.32506: variable 'ansible_shell_type' from source: unknown 28023 1726853613.32512: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.32517: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.32523: variable 'ansible_pipelining' from source: unknown 28023 1726853613.32529: variable 'ansible_timeout' from source: unknown 28023 1726853613.32535: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node3' 28023 1726853613.32683: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853613.32697: variable 'omit' from source: magic vars 28023 1726853613.32707: starting attempt loop 28023 1726853613.32814: running the handler 28023 1726853613.32816: handler run complete 28023 1726853613.32818: attempt loop complete, returning result 28023 1726853613.32820: _execute() done 28023 1726853613.32821: dumping result to json 28023 1726853613.32823: done dumping result, returning 28023 1726853613.32825: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [02083763-bbaf-fdb6-dad7-00000000014f] 28023 1726853613.32827: sending task result for task 02083763-bbaf-fdb6-dad7-00000000014f 28023 1726853613.32917: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000014f 28023 1726853613.32920: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 28023 1726853613.32987: no more pending results, returning what we have 28023 1726853613.32989: results queue empty 28023 1726853613.32990: checking for any_errors_fatal 28023 1726853613.32996: done checking for any_errors_fatal 28023 1726853613.32997: checking for max_fail_percentage 28023 1726853613.32998: done checking for max_fail_percentage 28023 1726853613.32999: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.33000: done checking to see if all hosts have failed 28023 1726853613.33001: getting the remaining hosts for this loop 28023 1726853613.33002: done getting the remaining hosts for this loop 28023 1726853613.33005: getting the next task for host 
managed_node3 28023 1726853613.33011: done getting next task for host managed_node3 28023 1726853613.33013: ^ task is: TASK: Show current_interfaces 28023 1726853613.33016: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853613.33019: getting variables 28023 1726853613.33020: in VariableManager get_vars() 28023 1726853613.33055: Calling all_inventory to load vars for managed_node3 28023 1726853613.33058: Calling groups_inventory to load vars for managed_node3 28023 1726853613.33060: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.33078: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.33081: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.33083: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.33202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.33408: done with get_vars() 28023 1726853613.33419: done getting variables 28023 1726853613.33509: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:33:33 -0400 (0:00:00.029) 0:00:05.419 ****** 28023 1726853613.33539: entering _queue_task() for managed_node3/debug 28023 1726853613.33541: Creating lock for debug 28023 1726853613.33839: worker is 1 (out of 1 available) 28023 1726853613.33852: exiting _queue_task() for managed_node3/debug 28023 1726853613.33863: done queuing things up, now waiting for results queue to drain 28023 1726853613.33864: waiting for pending results... 28023 1726853613.34297: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 28023 1726853613.34329: in run() - task 02083763-bbaf-fdb6-dad7-000000000136 28023 1726853613.34333: variable 'ansible_search_path' from source: unknown 28023 1726853613.34336: variable 'ansible_search_path' from source: unknown 28023 1726853613.34340: calling self._execute() 28023 1726853613.34348: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.34357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.34362: variable 'omit' from source: magic vars 28023 1726853613.34634: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.34637: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.34644: variable 'omit' from source: magic vars 28023 1726853613.34674: variable 'omit' from source: magic vars 28023 1726853613.34759: variable 'current_interfaces' from source: set_fact 28023 1726853613.34763: variable 'omit' from source: magic vars 28023 1726853613.34799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853613.35002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853613.35005: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853613.35008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.35277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.35280: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853613.35283: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.35285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.35326: Set connection var ansible_shell_type to sh 28023 1726853613.35340: Set connection var ansible_shell_executable to /bin/sh 28023 1726853613.35350: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853613.35361: Set connection var ansible_connection to ssh 28023 1726853613.35676: Set connection var ansible_pipelining to False 28023 1726853613.35679: Set connection var ansible_timeout to 10 28023 1726853613.35681: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.35684: variable 'ansible_connection' from source: unknown 28023 1726853613.35685: variable 'ansible_module_compression' from source: unknown 28023 1726853613.35687: variable 'ansible_shell_type' from source: unknown 28023 1726853613.35689: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.35691: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.35693: variable 'ansible_pipelining' from source: unknown 28023 1726853613.35695: variable 'ansible_timeout' from source: unknown 28023 1726853613.35697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.35803: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853613.35819: variable 'omit' from source: magic vars 28023 1726853613.36076: starting attempt loop 28023 1726853613.36080: running the handler 28023 1726853613.36082: handler run complete 28023 1726853613.36084: attempt loop complete, returning result 28023 1726853613.36086: _execute() done 28023 1726853613.36088: dumping result to json 28023 1726853613.36090: done dumping result, returning 28023 1726853613.36093: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [02083763-bbaf-fdb6-dad7-000000000136] 28023 1726853613.36097: sending task result for task 02083763-bbaf-fdb6-dad7-000000000136 ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 28023 1726853613.36260: no more pending results, returning what we have 28023 1726853613.36264: results queue empty 28023 1726853613.36264: checking for any_errors_fatal 28023 1726853613.36274: done checking for any_errors_fatal 28023 1726853613.36275: checking for max_fail_percentage 28023 1726853613.36277: done checking for max_fail_percentage 28023 1726853613.36278: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.36279: done checking to see if all hosts have failed 28023 1726853613.36279: getting the remaining hosts for this loop 28023 1726853613.36281: done getting the remaining hosts for this loop 28023 1726853613.36285: getting the next task for host managed_node3 28023 1726853613.36293: done getting next task for host managed_node3 28023 1726853613.36301: ^ task is: TASK: Manage test interface 28023 1726853613.36303: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853613.36311: getting variables 28023 1726853613.36313: in VariableManager get_vars() 28023 1726853613.36358: Calling all_inventory to load vars for managed_node3 28023 1726853613.36361: Calling groups_inventory to load vars for managed_node3 28023 1726853613.36363: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.36690: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.36694: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.36699: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.36977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.37210: done with get_vars() 28023 1726853613.37221: done getting variables 28023 1726853613.37266: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000136 28023 1726853613.37269: WORKER PROCESS EXITING TASK [Manage test interface] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:17 Friday 20 September 2024 13:33:33 -0400 (0:00:00.038) 0:00:05.457 ****** 28023 1726853613.37364: entering _queue_task() for managed_node3/include_tasks 28023 1726853613.37636: worker is 1 (out of 1 available) 28023 1726853613.37648: exiting _queue_task() for managed_node3/include_tasks 28023 1726853613.37664: done queuing things up, now waiting for results queue to drain 28023 1726853613.37666: waiting for pending results... 
28023 1726853613.37969: running TaskExecutor() for managed_node3/TASK: Manage test interface 28023 1726853613.38077: in run() - task 02083763-bbaf-fdb6-dad7-00000000000d 28023 1726853613.38100: variable 'ansible_search_path' from source: unknown 28023 1726853613.38148: calling self._execute() 28023 1726853613.38286: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.38341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.38382: variable 'omit' from source: magic vars 28023 1726853613.39063: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.39100: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.39119: _execute() done 28023 1726853613.39129: dumping result to json 28023 1726853613.39137: done dumping result, returning 28023 1726853613.39151: done running TaskExecutor() for managed_node3/TASK: Manage test interface [02083763-bbaf-fdb6-dad7-00000000000d] 28023 1726853613.39163: sending task result for task 02083763-bbaf-fdb6-dad7-00000000000d 28023 1726853613.39302: no more pending results, returning what we have 28023 1726853613.39308: in VariableManager get_vars() 28023 1726853613.39368: Calling all_inventory to load vars for managed_node3 28023 1726853613.39374: Calling groups_inventory to load vars for managed_node3 28023 1726853613.39377: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.39392: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.39396: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.39399: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.40121: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000000d 28023 1726853613.40125: WORKER PROCESS EXITING 28023 1726853613.40204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 
28023 1726853613.40421: done with get_vars() 28023 1726853613.40429: variable 'ansible_search_path' from source: unknown 28023 1726853613.40449: we have included files to process 28023 1726853613.40450: generating all_blocks data 28023 1726853613.40451: done generating all_blocks data 28023 1726853613.40462: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28023 1726853613.40463: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28023 1726853613.40466: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28023 1726853613.41617: in VariableManager get_vars() 28023 1726853613.41641: done with get_vars() 28023 1726853613.42594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28023 1726853613.43569: done processing included file 28023 1726853613.43575: iterating over new_blocks loaded from include file 28023 1726853613.43577: in VariableManager get_vars() 28023 1726853613.43597: done with get_vars() 28023 1726853613.43599: filtering new block on tags 28023 1726853613.43632: done filtering new block on tags 28023 1726853613.43634: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 28023 1726853613.43640: extending task lists for all hosts with included blocks 28023 1726853613.44074: done extending task lists 28023 1726853613.44076: done processing included files 28023 1726853613.44077: results queue empty 28023 1726853613.44077: checking for any_errors_fatal 28023 1726853613.44080: done checking for any_errors_fatal 28023 1726853613.44081: checking for max_fail_percentage 28023 1726853613.44082: done checking for 
max_fail_percentage 28023 1726853613.44082: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.44084: done checking to see if all hosts have failed 28023 1726853613.44084: getting the remaining hosts for this loop 28023 1726853613.44086: done getting the remaining hosts for this loop 28023 1726853613.44173: getting the next task for host managed_node3 28023 1726853613.44179: done getting next task for host managed_node3 28023 1726853613.44182: ^ task is: TASK: Ensure state in ["present", "absent"] 28023 1726853613.44184: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853613.44186: getting variables 28023 1726853613.44187: in VariableManager get_vars() 28023 1726853613.44225: Calling all_inventory to load vars for managed_node3 28023 1726853613.44228: Calling groups_inventory to load vars for managed_node3 28023 1726853613.44230: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.44237: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.44239: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.44242: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.44460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.44731: done with get_vars() 28023 1726853613.44749: done getting variables 28023 1726853613.44820: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:33:33 -0400 (0:00:00.074) 0:00:05.532 ****** 28023 1726853613.44847: entering _queue_task() for managed_node3/fail 28023 1726853613.44853: Creating lock for fail 28023 1726853613.45157: worker is 1 (out of 1 available) 28023 1726853613.45168: exiting _queue_task() for managed_node3/fail 28023 1726853613.45182: done queuing things up, now waiting for results queue to drain 28023 1726853613.45183: waiting for pending results... 
28023 1726853613.45336: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 28023 1726853613.45412: in run() - task 02083763-bbaf-fdb6-dad7-00000000016a 28023 1726853613.45420: variable 'ansible_search_path' from source: unknown 28023 1726853613.45423: variable 'ansible_search_path' from source: unknown 28023 1726853613.45452: calling self._execute() 28023 1726853613.45520: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.45528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.45533: variable 'omit' from source: magic vars 28023 1726853613.45797: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.45807: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.45901: variable 'state' from source: include params 28023 1726853613.45906: Evaluated conditional (state not in ["present", "absent"]): False 28023 1726853613.45909: when evaluation is False, skipping this task 28023 1726853613.45912: _execute() done 28023 1726853613.45916: dumping result to json 28023 1726853613.45919: done dumping result, returning 28023 1726853613.45926: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-fdb6-dad7-00000000016a] 28023 1726853613.45931: sending task result for task 02083763-bbaf-fdb6-dad7-00000000016a 28023 1726853613.46013: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000016a 28023 1726853613.46015: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 28023 1726853613.46100: no more pending results, returning what we have 28023 1726853613.46104: results queue empty 28023 1726853613.46105: checking for any_errors_fatal 28023 1726853613.46106: done checking for any_errors_fatal 28023 1726853613.46107: 
checking for max_fail_percentage 28023 1726853613.46108: done checking for max_fail_percentage 28023 1726853613.46109: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.46109: done checking to see if all hosts have failed 28023 1726853613.46110: getting the remaining hosts for this loop 28023 1726853613.46112: done getting the remaining hosts for this loop 28023 1726853613.46115: getting the next task for host managed_node3 28023 1726853613.46120: done getting next task for host managed_node3 28023 1726853613.46122: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 28023 1726853613.46127: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853613.46130: getting variables 28023 1726853613.46131: in VariableManager get_vars() 28023 1726853613.46162: Calling all_inventory to load vars for managed_node3 28023 1726853613.46165: Calling groups_inventory to load vars for managed_node3 28023 1726853613.46167: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.46177: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.46180: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.46182: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.46299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.46426: done with get_vars() 28023 1726853613.46433: done getting variables 28023 1726853613.46496: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:33:33 -0400 (0:00:00.016) 0:00:05.549 ****** 28023 1726853613.46544: entering _queue_task() for managed_node3/fail 28023 1726853613.46907: worker is 1 (out of 1 available) 28023 1726853613.46920: exiting _queue_task() for managed_node3/fail 28023 1726853613.46931: done queuing things up, now waiting for results queue to drain 28023 1726853613.46932: waiting for pending results... 
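The skip recorded above (task `Ensure state in ["present", "absent"]` at manage_test_interface.yml:3, action module `fail`) implies a guard task of roughly the following shape. This is a sketch reconstructed from the `false_condition` string in the result, not the verbatim source; the `msg` text is an assumption.

```yaml
# Hypothetical reconstruction of the guard at
# tasks/manage_test_interface.yml:3 — only the task name, the use of
# the 'fail' action, and the when-condition are confirmed by the log.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be 'present' or 'absent', got {{ state }}"  # wording assumed
  when: state not in ["present", "absent"]
```

Here the condition evaluated False (the include param `state` was an allowed value), so the `fail` action never ran — exactly what `skip_reason: Conditional result was False` records.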
28023 1726853613.47488: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 28023 1726853613.47498: in run() - task 02083763-bbaf-fdb6-dad7-00000000016b 28023 1726853613.47598: variable 'ansible_search_path' from source: unknown 28023 1726853613.47601: variable 'ansible_search_path' from source: unknown 28023 1726853613.47743: calling self._execute() 28023 1726853613.47828: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.47864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.47868: variable 'omit' from source: magic vars 28023 1726853613.48193: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.48203: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.48309: variable 'type' from source: set_fact 28023 1726853613.48313: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 28023 1726853613.48316: when evaluation is False, skipping this task 28023 1726853613.48318: _execute() done 28023 1726853613.48323: dumping result to json 28023 1726853613.48328: done dumping result, returning 28023 1726853613.48332: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-fdb6-dad7-00000000016b] 28023 1726853613.48337: sending task result for task 02083763-bbaf-fdb6-dad7-00000000016b 28023 1726853613.48416: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000016b 28023 1726853613.48418: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 28023 1726853613.48478: no more pending results, returning what we have 28023 1726853613.48482: results queue empty 28023 1726853613.48483: checking for any_errors_fatal 28023 1726853613.48488: done checking for any_errors_fatal 28023 1726853613.48489: 
checking for max_fail_percentage 28023 1726853613.48490: done checking for max_fail_percentage 28023 1726853613.48491: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.48492: done checking to see if all hosts have failed 28023 1726853613.48493: getting the remaining hosts for this loop 28023 1726853613.48495: done getting the remaining hosts for this loop 28023 1726853613.48498: getting the next task for host managed_node3 28023 1726853613.48503: done getting next task for host managed_node3 28023 1726853613.48505: ^ task is: TASK: Include the task 'show_interfaces.yml' 28023 1726853613.48508: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853613.48511: getting variables 28023 1726853613.48512: in VariableManager get_vars() 28023 1726853613.48547: Calling all_inventory to load vars for managed_node3 28023 1726853613.48549: Calling groups_inventory to load vars for managed_node3 28023 1726853613.48551: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.48563: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.48565: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.48568: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.48728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.48851: done with get_vars() 28023 1726853613.48860: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:33:33 -0400 (0:00:00.023) 0:00:05.573 ****** 28023 1726853613.48921: entering _queue_task() for managed_node3/include_tasks 28023 1726853613.49100: worker is 1 (out of 1 available) 28023 1726853613.49112: exiting _queue_task() for managed_node3/include_tasks 28023 1726853613.49125: done queuing things up, now waiting for results queue to drain 28023 1726853613.49127: waiting for pending results... 
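The companion guard at manage_test_interface.yml:8 follows the same pattern for the interface type; again a sketch inferred from the logged `false_condition`, with the `msg` wording assumed:

```yaml
# Hypothetical reconstruction of the guard at
# tasks/manage_test_interface.yml:8 — mirrors the state check above,
# validating the 'type' set_fact variable instead.
- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be one of dummy, tap, veth, got {{ type }}"  # wording assumed
  when: type not in ["dummy", "tap", "veth"]
```

Note `found_in_cache=True` on the second `fail` ActionModule load: the plugin class was cached by the first guard, so only the first task paid the module-loading cost.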
28023 1726853613.49269: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 28023 1726853613.49330: in run() - task 02083763-bbaf-fdb6-dad7-00000000016c 28023 1726853613.49340: variable 'ansible_search_path' from source: unknown 28023 1726853613.49344: variable 'ansible_search_path' from source: unknown 28023 1726853613.49374: calling self._execute() 28023 1726853613.49430: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.49435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.49443: variable 'omit' from source: magic vars 28023 1726853613.49693: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.49703: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.49709: _execute() done 28023 1726853613.49711: dumping result to json 28023 1726853613.49714: done dumping result, returning 28023 1726853613.49720: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-fdb6-dad7-00000000016c] 28023 1726853613.49725: sending task result for task 02083763-bbaf-fdb6-dad7-00000000016c 28023 1726853613.49806: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000016c 28023 1726853613.49809: WORKER PROCESS EXITING 28023 1726853613.49833: no more pending results, returning what we have 28023 1726853613.49838: in VariableManager get_vars() 28023 1726853613.49883: Calling all_inventory to load vars for managed_node3 28023 1726853613.49886: Calling groups_inventory to load vars for managed_node3 28023 1726853613.49887: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.49896: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.49898: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.49901: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.50023: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.50148: done with get_vars() 28023 1726853613.50156: variable 'ansible_search_path' from source: unknown 28023 1726853613.50157: variable 'ansible_search_path' from source: unknown 28023 1726853613.50181: we have included files to process 28023 1726853613.50182: generating all_blocks data 28023 1726853613.50183: done generating all_blocks data 28023 1726853613.50186: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853613.50186: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853613.50187: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853613.50248: in VariableManager get_vars() 28023 1726853613.50267: done with get_vars() 28023 1726853613.50337: done processing included file 28023 1726853613.50339: iterating over new_blocks loaded from include file 28023 1726853613.50339: in VariableManager get_vars() 28023 1726853613.50378: done with get_vars() 28023 1726853613.50380: filtering new block on tags 28023 1726853613.50391: done filtering new block on tags 28023 1726853613.50392: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 28023 1726853613.50396: extending task lists for all hosts with included blocks 28023 1726853613.50614: done extending task lists 28023 1726853613.50615: done processing included files 28023 1726853613.50616: results queue empty 28023 1726853613.50616: checking for any_errors_fatal 28023 1726853613.50618: done checking for any_errors_fatal 28023 1726853613.50618: checking for 
max_fail_percentage 28023 1726853613.50619: done checking for max_fail_percentage 28023 1726853613.50620: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.50620: done checking to see if all hosts have failed 28023 1726853613.50621: getting the remaining hosts for this loop 28023 1726853613.50621: done getting the remaining hosts for this loop 28023 1726853613.50623: getting the next task for host managed_node3 28023 1726853613.50626: done getting next task for host managed_node3 28023 1726853613.50627: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28023 1726853613.50629: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853613.50631: getting variables 28023 1726853613.50632: in VariableManager get_vars() 28023 1726853613.50640: Calling all_inventory to load vars for managed_node3 28023 1726853613.50641: Calling groups_inventory to load vars for managed_node3 28023 1726853613.50643: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.50646: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.50647: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.50649: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.50735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.50851: done with get_vars() 28023 1726853613.50859: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:33:33 -0400 (0:00:00.019) 0:00:05.592 ****** 28023 1726853613.50906: entering _queue_task() for managed_node3/include_tasks 28023 1726853613.51086: worker is 1 (out of 1 available) 28023 1726853613.51100: exiting _queue_task() for managed_node3/include_tasks 28023 1726853613.51113: done queuing things up, now waiting for results queue to drain 28023 1726853613.51115: waiting for pending results... 
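The "we have included files to process / generating all_blocks data / extending task lists" sequences in this log are produced by `include_tasks`. The task queued above is presumably shaped like this minimal sketch (the relative path is an assumption based on the logged file locations):

```yaml
# Assumed shape of show_interfaces.yml:3 — a dynamic include, which is
# why the log shows the included blocks being generated and the task
# list being extended at runtime rather than at parse time.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: tasks/get_current_interfaces.yml
```

Because the include is dynamic, its child tasks get fresh entries in HOST STATE (the nested "tasks child state?" blocks visible in the state dumps above grow one level per nested include).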
28023 1726853613.51256: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 28023 1726853613.51320: in run() - task 02083763-bbaf-fdb6-dad7-00000000019d 28023 1726853613.51329: variable 'ansible_search_path' from source: unknown 28023 1726853613.51334: variable 'ansible_search_path' from source: unknown 28023 1726853613.51362: calling self._execute() 28023 1726853613.51423: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.51426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.51436: variable 'omit' from source: magic vars 28023 1726853613.51712: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.51723: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.51728: _execute() done 28023 1726853613.51731: dumping result to json 28023 1726853613.51734: done dumping result, returning 28023 1726853613.51741: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-fdb6-dad7-00000000019d] 28023 1726853613.51746: sending task result for task 02083763-bbaf-fdb6-dad7-00000000019d 28023 1726853613.51824: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000019d 28023 1726853613.51827: WORKER PROCESS EXITING 28023 1726853613.51852: no more pending results, returning what we have 28023 1726853613.51858: in VariableManager get_vars() 28023 1726853613.51901: Calling all_inventory to load vars for managed_node3 28023 1726853613.51904: Calling groups_inventory to load vars for managed_node3 28023 1726853613.51906: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.51915: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.51918: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.51920: Calling groups_plugins_play to load vars for managed_node3 28023 
1726853613.52070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.52198: done with get_vars() 28023 1726853613.52203: variable 'ansible_search_path' from source: unknown 28023 1726853613.52204: variable 'ansible_search_path' from source: unknown 28023 1726853613.52240: we have included files to process 28023 1726853613.52240: generating all_blocks data 28023 1726853613.52241: done generating all_blocks data 28023 1726853613.52242: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853613.52243: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853613.52244: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853613.52414: done processing included file 28023 1726853613.52416: iterating over new_blocks loaded from include file 28023 1726853613.52417: in VariableManager get_vars() 28023 1726853613.52428: done with get_vars() 28023 1726853613.52429: filtering new block on tags 28023 1726853613.52440: done filtering new block on tags 28023 1726853613.52441: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 28023 1726853613.52444: extending task lists for all hosts with included blocks 28023 1726853613.52530: done extending task lists 28023 1726853613.52531: done processing included files 28023 1726853613.52532: results queue empty 28023 1726853613.52532: checking for any_errors_fatal 28023 1726853613.52535: done checking for any_errors_fatal 28023 1726853613.52535: checking for max_fail_percentage 28023 1726853613.52536: done 
checking for max_fail_percentage 28023 1726853613.52536: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.52537: done checking to see if all hosts have failed 28023 1726853613.52537: getting the remaining hosts for this loop 28023 1726853613.52538: done getting the remaining hosts for this loop 28023 1726853613.52539: getting the next task for host managed_node3 28023 1726853613.52542: done getting next task for host managed_node3 28023 1726853613.52543: ^ task is: TASK: Gather current interface info 28023 1726853613.52546: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853613.52547: getting variables 28023 1726853613.52548: in VariableManager get_vars() 28023 1726853613.52558: Calling all_inventory to load vars for managed_node3 28023 1726853613.52559: Calling groups_inventory to load vars for managed_node3 28023 1726853613.52560: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.52563: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.52564: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.52566: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.52656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.52792: done with get_vars() 28023 1726853613.52798: done getting variables 28023 1726853613.52823: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:33:33 -0400 (0:00:00.019) 0:00:05.612 ****** 28023 1726853613.52844: entering _queue_task() for managed_node3/command 28023 1726853613.53028: worker is 1 (out of 1 available) 28023 1726853613.53041: exiting _queue_task() for managed_node3/command 28023 1726853613.53053: done queuing things up, now waiting for results queue to drain 28023 1726853613.53057: waiting for pending results... 
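The "Gather current interface info" task loads the `command` action module and, in the `_low_level_execute_command()` calls that follow, runs over the persistent SSH connection. The log does not show the command itself, so the following is only an assumed sketch; the command string and `register` name are guesses, and only the task name and the use of `command` are confirmed:

```yaml
# Hypothetical shape of get_current_interfaces.yml:3 — enumerating
# kernel network interfaces is the typical implementation, but the
# actual command is not visible in this log.
- name: Gather current interface info
  command: ls -1 /sys/class/net
  register: _current_interfaces  # variable name assumed
  changed_when: false
```

The `Set connection var` lines above (shell type `sh`, pipelining False, timeout 10) govern how this command is wrapped: with pipelining off, Ansible first creates a remote tmp dir and transfers `AnsiballZ_command.py`, which is precisely the `mkdir -p ... ansible-tmp-...` and "transferring module" activity in the subsequent log lines.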
28023 1726853613.53196: running TaskExecutor() for managed_node3/TASK: Gather current interface info 28023 1726853613.53261: in run() - task 02083763-bbaf-fdb6-dad7-0000000001d4 28023 1726853613.53272: variable 'ansible_search_path' from source: unknown 28023 1726853613.53276: variable 'ansible_search_path' from source: unknown 28023 1726853613.53304: calling self._execute() 28023 1726853613.53364: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.53368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.53378: variable 'omit' from source: magic vars 28023 1726853613.53626: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.53637: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.53643: variable 'omit' from source: magic vars 28023 1726853613.53677: variable 'omit' from source: magic vars 28023 1726853613.53700: variable 'omit' from source: magic vars 28023 1726853613.53731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853613.53759: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853613.53774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853613.53787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.53797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.53818: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853613.53821: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.53825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 
1726853613.53891: Set connection var ansible_shell_type to sh 28023 1726853613.53898: Set connection var ansible_shell_executable to /bin/sh 28023 1726853613.53903: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853613.53908: Set connection var ansible_connection to ssh 28023 1726853613.53913: Set connection var ansible_pipelining to False 28023 1726853613.53918: Set connection var ansible_timeout to 10 28023 1726853613.53941: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.53944: variable 'ansible_connection' from source: unknown 28023 1726853613.53947: variable 'ansible_module_compression' from source: unknown 28023 1726853613.53950: variable 'ansible_shell_type' from source: unknown 28023 1726853613.53952: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.53956: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.53959: variable 'ansible_pipelining' from source: unknown 28023 1726853613.53961: variable 'ansible_timeout' from source: unknown 28023 1726853613.53963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.54052: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853613.54059: variable 'omit' from source: magic vars 28023 1726853613.54061: starting attempt loop 28023 1726853613.54064: running the handler 28023 1726853613.54081: _low_level_execute_command(): starting 28023 1726853613.54088: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853613.54564: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 28023 1726853613.54606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853613.54609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.54612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853613.54616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853613.54618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.54661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853613.54665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853613.54667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853613.54743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.56458: stdout chunk (state=3): >>>/root <<< 28023 1726853613.56562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.56590: stderr chunk (state=3): >>><<< 28023 1726853613.56593: stdout chunk (state=3): >>><<< 28023 1726853613.56613: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853613.56622: _low_level_execute_command(): starting 28023 1726853613.56627: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590 `" && echo ansible-tmp-1726853613.5661154-28332-55027005240590="` echo /root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590 `" ) && sleep 0' 28023 1726853613.57040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853613.57081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853613.57084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853613.57094: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.57097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853613.57099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853613.57102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.57142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853613.57149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853613.57151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853613.57209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.59174: stdout chunk (state=3): >>>ansible-tmp-1726853613.5661154-28332-55027005240590=/root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590 <<< 28023 1726853613.59279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.59302: stderr chunk (state=3): >>><<< 28023 1726853613.59305: stdout chunk (state=3): >>><<< 28023 1726853613.59318: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853613.5661154-28332-55027005240590=/root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853613.59347: variable 'ansible_module_compression' from source: unknown 28023 1726853613.59389: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853613.59418: variable 'ansible_facts' from source: unknown 28023 1726853613.59481: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590/AnsiballZ_command.py 28023 1726853613.59580: Sending initial data 28023 1726853613.59585: Sent initial data (155 bytes) 28023 1726853613.60024: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853613.60027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 
1726853613.60029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.60033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853613.60035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.60078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853613.60097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853613.60147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.61770: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853613.61825: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853613.61886: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpfh76et_a /root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590/AnsiballZ_command.py <<< 28023 1726853613.61893: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590/AnsiballZ_command.py" <<< 28023 1726853613.61951: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpfh76et_a" to remote "/root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590/AnsiballZ_command.py" <<< 28023 1726853613.61953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590/AnsiballZ_command.py" <<< 28023 1726853613.62543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.62585: stderr chunk (state=3): >>><<< 28023 1726853613.62589: stdout chunk (state=3): >>><<< 28023 1726853613.62606: done transferring module to remote 28023 1726853613.62614: _low_level_execute_command(): starting 28023 1726853613.62618: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590/ /root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590/AnsiballZ_command.py && sleep 0' 28023 1726853613.63053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853613.63057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853613.63060: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853613.63065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853613.63067: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.63114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853613.63117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853613.63186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.65028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.65053: stderr chunk (state=3): >>><<< 28023 1726853613.65056: stdout chunk (state=3): >>><<< 28023 1726853613.65073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853613.65076: _low_level_execute_command(): starting 28023 1726853613.65080: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590/AnsiballZ_command.py && sleep 0' 28023 1726853613.65499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853613.65503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.65505: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853613.65507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.65552: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853613.65560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853613.65625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.81453: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:33:33.809896", "end": "2024-09-20 13:33:33.813355", "delta": "0:00:00.003459", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853613.83064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853613.83094: stderr chunk (state=3): >>><<< 28023 1726853613.83098: stdout chunk (state=3): >>><<< 28023 1726853613.83116: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:33:33.809896", "end": "2024-09-20 13:33:33.813355", "delta": "0:00:00.003459", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
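The module execution above returns its result as a single JSON object on stdout (visible verbatim in the chunk above), which the controller decodes before the later `Set current_interfaces` task turns `stdout` into a list of interface names. A minimal Python sketch of that decoding step, using the JSON exactly as it appears in the log (trimmed to the keys used here); the variable names are illustrative, not Ansible internals:

```python
import json

# JSON result emitted by AnsiballZ_command.py, copied from the log above
# and trimmed to the relevant keys.
raw = ('{"changed": true, "stdout": "bonding_masters\\neth0\\nlo\\nrpltstbr", '
       '"stderr": "", "rc": 0, "cmd": ["ls", "-1"]}')

result = json.loads(raw)
assert result["rc"] == 0  # the log shows rc=0 for this task

# Equivalent of stdout_lines -- the value the set_fact task later stores
# as the current_interfaces fact.
current_interfaces = result["stdout"].splitlines()
print(current_interfaces)  # ['bonding_masters', 'eth0', 'lo', 'rpltstbr']
```

This matches the fact value shown later in the run, where `current_interfaces` is set to `["bonding_masters", "eth0", "lo", "rpltstbr"]`.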
28023 1726853613.83146: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853613.83152: _low_level_execute_command(): starting 28023 1726853613.83160: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853613.5661154-28332-55027005240590/ > /dev/null 2>&1 && sleep 0' 28023 1726853613.83612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853613.83616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853613.83622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28023 1726853613.83624: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853613.83626: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.83682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853613.83687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853613.83748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.85622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.85648: stderr chunk (state=3): >>><<< 28023 1726853613.85651: stdout chunk (state=3): >>><<< 28023 1726853613.85665: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853613.85672: handler run complete 28023 1726853613.85689: Evaluated conditional (False): False 28023 1726853613.85698: attempt loop complete, returning result 28023 1726853613.85701: _execute() done 28023 1726853613.85703: dumping result to json 28023 1726853613.85710: done dumping result, returning 28023 1726853613.85718: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-fdb6-dad7-0000000001d4] 28023 1726853613.85726: sending task result for task 02083763-bbaf-fdb6-dad7-0000000001d4 28023 1726853613.85818: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000001d4 28023 1726853613.85821: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003459", "end": "2024-09-20 13:33:33.813355", "rc": 0, "start": "2024-09-20 13:33:33.809896" } STDOUT: bonding_masters eth0 lo rpltstbr 28023 1726853613.85894: no more pending results, returning what we have 28023 1726853613.85898: results queue empty 28023 1726853613.85899: checking for any_errors_fatal 28023 1726853613.85900: done checking for any_errors_fatal 28023 1726853613.85900: checking for max_fail_percentage 28023 1726853613.85902: done checking for max_fail_percentage 28023 1726853613.85903: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.85904: done checking to see if all hosts have failed 28023 1726853613.85904: getting the remaining hosts for this loop 28023 1726853613.85906: done getting the remaining hosts for this loop 28023 1726853613.85909: getting the next task for host managed_node3 28023 1726853613.85915: done getting next task for host managed_node3 28023 1726853613.85917: ^ task is: TASK: Set current_interfaces 28023 1726853613.85922: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853613.85925: getting variables 28023 1726853613.85926: in VariableManager get_vars() 28023 1726853613.85968: Calling all_inventory to load vars for managed_node3 28023 1726853613.85974: Calling groups_inventory to load vars for managed_node3 28023 1726853613.85976: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.85987: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.85990: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.85992: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.86147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.86279: done with get_vars() 28023 1726853613.86287: done getting variables 28023 1726853613.86329: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:33:33 -0400 (0:00:00.335) 0:00:05.947 ****** 28023 1726853613.86352: entering _queue_task() for managed_node3/set_fact 28023 1726853613.86552: worker is 1 (out of 1 available) 28023 1726853613.86565: exiting _queue_task() for managed_node3/set_fact 28023 1726853613.86581: done queuing things up, now waiting for results queue to drain 28023 1726853613.86583: waiting for pending results... 
28023 1726853613.86736: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 28023 1726853613.86812: in run() - task 02083763-bbaf-fdb6-dad7-0000000001d5 28023 1726853613.86822: variable 'ansible_search_path' from source: unknown 28023 1726853613.86825: variable 'ansible_search_path' from source: unknown 28023 1726853613.86852: calling self._execute() 28023 1726853613.86918: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.86923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.86932: variable 'omit' from source: magic vars 28023 1726853613.87248: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.87262: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.87268: variable 'omit' from source: magic vars 28023 1726853613.87303: variable 'omit' from source: magic vars 28023 1726853613.87380: variable '_current_interfaces' from source: set_fact 28023 1726853613.87425: variable 'omit' from source: magic vars 28023 1726853613.87458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853613.87491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853613.87506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853613.87519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.87528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.87550: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853613.87552: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.87555: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.87624: Set connection var ansible_shell_type to sh 28023 1726853613.87630: Set connection var ansible_shell_executable to /bin/sh 28023 1726853613.87635: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853613.87641: Set connection var ansible_connection to ssh 28023 1726853613.87646: Set connection var ansible_pipelining to False 28023 1726853613.87650: Set connection var ansible_timeout to 10 28023 1726853613.87674: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.87678: variable 'ansible_connection' from source: unknown 28023 1726853613.87682: variable 'ansible_module_compression' from source: unknown 28023 1726853613.87684: variable 'ansible_shell_type' from source: unknown 28023 1726853613.87687: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.87690: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.87692: variable 'ansible_pipelining' from source: unknown 28023 1726853613.87695: variable 'ansible_timeout' from source: unknown 28023 1726853613.87696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.87793: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853613.87801: variable 'omit' from source: magic vars 28023 1726853613.87815: starting attempt loop 28023 1726853613.87819: running the handler 28023 1726853613.87821: handler run complete 28023 1726853613.87828: attempt loop complete, returning result 28023 1726853613.87831: _execute() done 28023 1726853613.87834: dumping result to json 28023 1726853613.87837: done dumping result, returning 28023 
1726853613.87845: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [02083763-bbaf-fdb6-dad7-0000000001d5] 28023 1726853613.87847: sending task result for task 02083763-bbaf-fdb6-dad7-0000000001d5 ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 28023 1726853613.87983: no more pending results, returning what we have 28023 1726853613.87986: results queue empty 28023 1726853613.87987: checking for any_errors_fatal 28023 1726853613.87995: done checking for any_errors_fatal 28023 1726853613.87995: checking for max_fail_percentage 28023 1726853613.87997: done checking for max_fail_percentage 28023 1726853613.87997: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.87998: done checking to see if all hosts have failed 28023 1726853613.87999: getting the remaining hosts for this loop 28023 1726853613.88001: done getting the remaining hosts for this loop 28023 1726853613.88004: getting the next task for host managed_node3 28023 1726853613.88011: done getting next task for host managed_node3 28023 1726853613.88013: ^ task is: TASK: Show current_interfaces 28023 1726853613.88017: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853613.88021: getting variables 28023 1726853613.88022: in VariableManager get_vars() 28023 1726853613.88063: Calling all_inventory to load vars for managed_node3 28023 1726853613.88065: Calling groups_inventory to load vars for managed_node3 28023 1726853613.88067: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.88074: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000001d5 28023 1726853613.88076: WORKER PROCESS EXITING 28023 1726853613.88084: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.88087: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.88089: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.88255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.88383: done with get_vars() 28023 1726853613.88390: done getting variables 28023 1726853613.88427: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:33:33 -0400 (0:00:00.020) 0:00:05.968 ****** 28023 1726853613.88446: entering _queue_task() for managed_node3/debug 28023 1726853613.88631: worker is 1 (out of 1 available) 28023 1726853613.88644: exiting _queue_task() for managed_node3/debug 28023 1726853613.88656: done queuing things up, now waiting for results queue to drain 28023 1726853613.88657: waiting for pending 
results... 28023 1726853613.88806: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 28023 1726853613.88867: in run() - task 02083763-bbaf-fdb6-dad7-00000000019e 28023 1726853613.88882: variable 'ansible_search_path' from source: unknown 28023 1726853613.88885: variable 'ansible_search_path' from source: unknown 28023 1726853613.88912: calling self._execute() 28023 1726853613.88978: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.88982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.88990: variable 'omit' from source: magic vars 28023 1726853613.89245: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.89255: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.89263: variable 'omit' from source: magic vars 28023 1726853613.89293: variable 'omit' from source: magic vars 28023 1726853613.89361: variable 'current_interfaces' from source: set_fact 28023 1726853613.89389: variable 'omit' from source: magic vars 28023 1726853613.89420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853613.89449: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853613.89467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853613.89487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.89495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.89517: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853613.89520: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.89523: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.89593: Set connection var ansible_shell_type to sh 28023 1726853613.89600: Set connection var ansible_shell_executable to /bin/sh 28023 1726853613.89605: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853613.89610: Set connection var ansible_connection to ssh 28023 1726853613.89615: Set connection var ansible_pipelining to False 28023 1726853613.89620: Set connection var ansible_timeout to 10 28023 1726853613.89639: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.89642: variable 'ansible_connection' from source: unknown 28023 1726853613.89646: variable 'ansible_module_compression' from source: unknown 28023 1726853613.89649: variable 'ansible_shell_type' from source: unknown 28023 1726853613.89651: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.89653: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.89655: variable 'ansible_pipelining' from source: unknown 28023 1726853613.89660: variable 'ansible_timeout' from source: unknown 28023 1726853613.89672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.89768: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853613.89776: variable 'omit' from source: magic vars 28023 1726853613.89782: starting attempt loop 28023 1726853613.89784: running the handler 28023 1726853613.89821: handler run complete 28023 1726853613.89832: attempt loop complete, returning result 28023 1726853613.89835: _execute() done 28023 1726853613.89837: dumping result to json 28023 1726853613.89840: done dumping result, returning 28023 1726853613.89846: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [02083763-bbaf-fdb6-dad7-00000000019e] 28023 1726853613.89849: sending task result for task 02083763-bbaf-fdb6-dad7-00000000019e 28023 1726853613.89925: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000019e 28023 1726853613.89927: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 28023 1726853613.89969: no more pending results, returning what we have 28023 1726853613.89974: results queue empty 28023 1726853613.89975: checking for any_errors_fatal 28023 1726853613.89980: done checking for any_errors_fatal 28023 1726853613.89981: checking for max_fail_percentage 28023 1726853613.89982: done checking for max_fail_percentage 28023 1726853613.89983: checking to see if all hosts have failed and the running result is not ok 28023 1726853613.89984: done checking to see if all hosts have failed 28023 1726853613.89985: getting the remaining hosts for this loop 28023 1726853613.89986: done getting the remaining hosts for this loop 28023 1726853613.89990: getting the next task for host managed_node3 28023 1726853613.89997: done getting next task for host managed_node3 28023 1726853613.89999: ^ task is: TASK: Install iproute 28023 1726853613.90002: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853613.90006: getting variables 28023 1726853613.90007: in VariableManager get_vars() 28023 1726853613.90040: Calling all_inventory to load vars for managed_node3 28023 1726853613.90042: Calling groups_inventory to load vars for managed_node3 28023 1726853613.90044: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853613.90052: Calling all_plugins_play to load vars for managed_node3 28023 1726853613.90054: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853613.90057: Calling groups_plugins_play to load vars for managed_node3 28023 1726853613.90185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853613.90311: done with get_vars() 28023 1726853613.90318: done getting variables 28023 1726853613.90354: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:33:33 -0400 (0:00:00.019) 0:00:05.987 ****** 28023 1726853613.90376: entering _queue_task() for managed_node3/package 28023 1726853613.90551: worker is 1 (out of 1 available) 28023 1726853613.90565: exiting _queue_task() for managed_node3/package 28023 1726853613.90579: done queuing things up, now waiting for results queue to drain 28023 1726853613.90580: waiting for pending results... 
28023 1726853613.90724: running TaskExecutor() for managed_node3/TASK: Install iproute 28023 1726853613.90778: in run() - task 02083763-bbaf-fdb6-dad7-00000000016d 28023 1726853613.90789: variable 'ansible_search_path' from source: unknown 28023 1726853613.90793: variable 'ansible_search_path' from source: unknown 28023 1726853613.90823: calling self._execute() 28023 1726853613.90890: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.90894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.90901: variable 'omit' from source: magic vars 28023 1726853613.91212: variable 'ansible_distribution_major_version' from source: facts 28023 1726853613.91222: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853613.91227: variable 'omit' from source: magic vars 28023 1726853613.91258: variable 'omit' from source: magic vars 28023 1726853613.91382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853613.92721: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853613.92773: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853613.92800: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853613.92823: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853613.92844: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853613.92911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853613.92930: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853613.92947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853613.92975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853613.92988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853613.93060: variable '__network_is_ostree' from source: set_fact 28023 1726853613.93064: variable 'omit' from source: magic vars 28023 1726853613.93096: variable 'omit' from source: magic vars 28023 1726853613.93110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853613.93130: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853613.93145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853613.93159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.93166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853613.93189: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853613.93192: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.93196: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 28023 1726853613.93260: Set connection var ansible_shell_type to sh 28023 1726853613.93263: Set connection var ansible_shell_executable to /bin/sh 28023 1726853613.93269: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853613.93275: Set connection var ansible_connection to ssh 28023 1726853613.93280: Set connection var ansible_pipelining to False 28023 1726853613.93285: Set connection var ansible_timeout to 10 28023 1726853613.93307: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.93310: variable 'ansible_connection' from source: unknown 28023 1726853613.93312: variable 'ansible_module_compression' from source: unknown 28023 1726853613.93314: variable 'ansible_shell_type' from source: unknown 28023 1726853613.93316: variable 'ansible_shell_executable' from source: unknown 28023 1726853613.93319: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853613.93321: variable 'ansible_pipelining' from source: unknown 28023 1726853613.93323: variable 'ansible_timeout' from source: unknown 28023 1726853613.93332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853613.93395: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853613.93403: variable 'omit' from source: magic vars 28023 1726853613.93408: starting attempt loop 28023 1726853613.93411: running the handler 28023 1726853613.93417: variable 'ansible_facts' from source: unknown 28023 1726853613.93419: variable 'ansible_facts' from source: unknown 28023 1726853613.93448: _low_level_execute_command(): starting 28023 1726853613.93456: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 
1726853613.93949: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853613.93953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.93956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853613.93958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853613.93961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.94016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853613.94019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853613.94021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853613.94094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.95797: stdout chunk (state=3): >>>/root <<< 28023 1726853613.95892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.95923: stderr chunk (state=3): >>><<< 28023 1726853613.95926: stdout chunk (state=3): >>><<< 28023 1726853613.95948: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853613.95961: _low_level_execute_command(): starting 28023 1726853613.95966: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539 `" && echo ansible-tmp-1726853613.959474-28342-93610673758539="` echo /root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539 `" ) && sleep 0' 28023 1726853613.96404: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853613.96407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match not found <<< 28023 1726853613.96409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853613.96411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853613.96413: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853613.96469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853613.96474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853613.96531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853613.98466: stdout chunk (state=3): >>>ansible-tmp-1726853613.959474-28342-93610673758539=/root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539 <<< 28023 1726853613.98574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853613.98599: stderr chunk (state=3): >>><<< 28023 1726853613.98606: stdout chunk (state=3): >>><<< 28023 1726853613.98619: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853613.959474-28342-93610673758539=/root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853613.98646: variable 'ansible_module_compression' from source: unknown 28023 1726853613.98695: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 28023 1726853613.98698: ANSIBALLZ: Acquiring lock 28023 1726853613.98702: ANSIBALLZ: Lock acquired: 139729396667488 28023 1726853613.98704: ANSIBALLZ: Creating module 28023 1726853614.08566: ANSIBALLZ: Writing module into payload 28023 1726853614.08704: ANSIBALLZ: Writing module 28023 1726853614.08721: ANSIBALLZ: Renaming module 28023 1726853614.08732: ANSIBALLZ: Done creating module 28023 1726853614.08749: variable 'ansible_facts' from source: unknown 28023 1726853614.08820: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539/AnsiballZ_dnf.py 28023 1726853614.08922: Sending initial data 28023 1726853614.08926: Sent initial data (150 bytes) 28023 1726853614.09382: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.09385: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853614.09389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853614.09392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853614.09394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853614.09449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853614.09457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853614.09459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853614.09525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853614.11204: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853614.11264: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853614.11318: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpoi8do263 /root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539/AnsiballZ_dnf.py <<< 28023 1726853614.11323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539/AnsiballZ_dnf.py" <<< 28023 1726853614.11381: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpoi8do263" to remote "/root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539/AnsiballZ_dnf.py" <<< 28023 1726853614.11383: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539/AnsiballZ_dnf.py" <<< 28023 1726853614.12121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853614.12165: stderr chunk (state=3): >>><<< 28023 1726853614.12168: stdout chunk (state=3): >>><<< 28023 1726853614.12213: done transferring module to remote 28023 1726853614.12221: _low_level_execute_command(): starting 28023 1726853614.12226: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539/ /root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539/AnsiballZ_dnf.py && sleep 0' 28023 1726853614.12724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853614.12727: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853614.12729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.12731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853614.12733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853614.12787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853614.12790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853614.12849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853614.14833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853614.14837: stdout chunk (state=3): >>><<< 28023 1726853614.14839: stderr chunk (state=3): >>><<< 28023 1726853614.14841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853614.14843: _low_level_execute_command(): starting 28023 1726853614.14845: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539/AnsiballZ_dnf.py && sleep 0' 28023 1726853614.15490: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853614.15520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853614.15537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853614.15558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853614.15652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853614.58557: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 28023 1726853614.63327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853614.63360: stderr chunk (state=3): >>><<< 28023 1726853614.63363: stdout chunk (state=3): >>><<< 28023 1726853614.63376: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853614.63410: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853614.63416: _low_level_execute_command(): starting 28023 1726853614.63421: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853613.959474-28342-93610673758539/ > /dev/null 2>&1 && sleep 0' 28023 1726853614.63886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853614.63889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853614.63893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853614.63895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.63898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853614.63900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853614.63948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853614.63951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853614.63953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853614.64019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853614.65896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853614.65922: stderr chunk (state=3): >>><<< 28023 1726853614.65925: stdout chunk (state=3): >>><<< 28023 1726853614.65939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853614.65946: handler run complete 28023 1726853614.66066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853614.66305: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853614.66308: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853614.66311: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853614.66329: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853614.66401: variable '__install_status' from source: unknown 28023 1726853614.66419: Evaluated conditional (__install_status is success): True 28023 1726853614.66434: attempt loop complete, returning result 28023 1726853614.66437: _execute() done 28023 1726853614.66439: dumping result to json 28023 1726853614.66447: done dumping result, returning 28023 1726853614.66455: done running TaskExecutor() for managed_node3/TASK: Install iproute [02083763-bbaf-fdb6-dad7-00000000016d] 28023 1726853614.66461: sending task result for task 02083763-bbaf-fdb6-dad7-00000000016d 28023 1726853614.66561: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000016d 28023 1726853614.66564: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 28023 1726853614.66779: no more pending results, returning what we have 28023 1726853614.66783: results queue empty 28023 
1726853614.66784: checking for any_errors_fatal 28023 1726853614.66789: done checking for any_errors_fatal 28023 1726853614.66790: checking for max_fail_percentage 28023 1726853614.66791: done checking for max_fail_percentage 28023 1726853614.66792: checking to see if all hosts have failed and the running result is not ok 28023 1726853614.66793: done checking to see if all hosts have failed 28023 1726853614.66793: getting the remaining hosts for this loop 28023 1726853614.66795: done getting the remaining hosts for this loop 28023 1726853614.66797: getting the next task for host managed_node3 28023 1726853614.66803: done getting next task for host managed_node3 28023 1726853614.66805: ^ task is: TASK: Create veth interface {{ interface }} 28023 1726853614.66807: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853614.66810: getting variables 28023 1726853614.66811: in VariableManager get_vars() 28023 1726853614.66924: Calling all_inventory to load vars for managed_node3 28023 1726853614.66927: Calling groups_inventory to load vars for managed_node3 28023 1726853614.66930: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853614.66939: Calling all_plugins_play to load vars for managed_node3 28023 1726853614.66942: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853614.66944: Calling groups_plugins_play to load vars for managed_node3 28023 1726853614.67118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853614.67332: done with get_vars() 28023 1726853614.67342: done getting variables 28023 1726853614.67407: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853614.67519: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:33:34 -0400 (0:00:00.771) 0:00:06.759 ****** 28023 1726853614.67551: entering _queue_task() for managed_node3/command 28023 1726853614.67760: worker is 1 (out of 1 available) 28023 1726853614.67775: exiting _queue_task() for managed_node3/command 28023 1726853614.67788: done queuing things up, now waiting for results queue to drain 28023 1726853614.67789: waiting for pending results... 
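The per-task remote execution pattern that repeats throughout this log — create a private temp directory under `~/.ansible/tmp`, transfer the `AnsiballZ_command.py` payload over sftp, `chmod u+x` it, run it with the remote Python, then `rm -f -r` the directory — can be sketched locally as follows. This is a simplified stand-in, not Ansible's actual code path: the payload here is a one-line placeholder, and `mktemp -d` stands in for the timestamped `ansible-tmp-...` directory names seen above.

```shell
set -e
# Stand-in for /root/.ansible/tmp/ansible-tmp-<ts>-<pid>-<rand> from the log.
TMP="$(mktemp -d)"
MODULE="$TMP/AnsiballZ_command.py"
# Placeholder payload; the real AnsiballZ file is a self-extracting zipped
# module shipped to the host over the multiplexed SSH/sftp connection.
printf '%s\n' 'print("{\"changed\": true, \"rc\": 0}")' > "$MODULE"
chmod u+x "$TMP" "$MODULE"   # the 'chmod u+x ... && sleep 0' step
python3 "$MODULE"            # the '/usr/bin/python3.12 .../AnsiballZ_command.py' step
rm -rf "$TMP"                # the 'rm -f -r ... > /dev/null 2>&1 && sleep 0' cleanup
```

Each `_low_level_execute_command()` pair in the log (the `starting` / `done: rc=0` lines) corresponds to one of these steps running over the shared SSH ControlMaster connection.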
28023 1726853614.67951: running TaskExecutor() for managed_node3/TASK: Create veth interface ethtest0 28023 1726853614.68025: in run() - task 02083763-bbaf-fdb6-dad7-00000000016e 28023 1726853614.68037: variable 'ansible_search_path' from source: unknown 28023 1726853614.68040: variable 'ansible_search_path' from source: unknown 28023 1726853614.68240: variable 'interface' from source: set_fact 28023 1726853614.68299: variable 'interface' from source: set_fact 28023 1726853614.68352: variable 'interface' from source: set_fact 28023 1726853614.68462: Loaded config def from plugin (lookup/items) 28023 1726853614.68468: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 28023 1726853614.68488: variable 'omit' from source: magic vars 28023 1726853614.68574: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853614.68581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853614.68590: variable 'omit' from source: magic vars 28023 1726853614.68740: variable 'ansible_distribution_major_version' from source: facts 28023 1726853614.68746: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853614.68877: variable 'type' from source: set_fact 28023 1726853614.68881: variable 'state' from source: include params 28023 1726853614.68884: variable 'interface' from source: set_fact 28023 1726853614.68887: variable 'current_interfaces' from source: set_fact 28023 1726853614.68893: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28023 1726853614.68899: variable 'omit' from source: magic vars 28023 1726853614.68922: variable 'omit' from source: magic vars 28023 1726853614.68949: variable 'item' from source: unknown 28023 1726853614.69004: variable 'item' from source: unknown 28023 1726853614.69015: variable 'omit' from source: magic vars 28023 1726853614.69039: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853614.69064: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853614.69081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853614.69096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853614.69104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853614.69127: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853614.69130: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853614.69132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853614.69202: Set connection var ansible_shell_type to sh 28023 1726853614.69205: Set connection var ansible_shell_executable to /bin/sh 28023 1726853614.69210: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853614.69215: Set connection var ansible_connection to ssh 28023 1726853614.69220: Set connection var ansible_pipelining to False 28023 1726853614.69225: Set connection var ansible_timeout to 10 28023 1726853614.69243: variable 'ansible_shell_executable' from source: unknown 28023 1726853614.69246: variable 'ansible_connection' from source: unknown 28023 1726853614.69248: variable 'ansible_module_compression' from source: unknown 28023 1726853614.69251: variable 'ansible_shell_type' from source: unknown 28023 1726853614.69253: variable 'ansible_shell_executable' from source: unknown 28023 1726853614.69255: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853614.69261: variable 'ansible_pipelining' from source: unknown 28023 1726853614.69263: variable 'ansible_timeout' from 
source: unknown 28023 1726853614.69267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853614.69567: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853614.69573: variable 'omit' from source: magic vars 28023 1726853614.69576: starting attempt loop 28023 1726853614.69579: running the handler 28023 1726853614.69581: _low_level_execute_command(): starting 28023 1726853614.69583: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853614.70132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853614.70149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.70168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853614.70189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853614.70205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853614.70215: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853614.70226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853614.70245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853614.70258: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853614.70268: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853614.70289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.70374: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853614.70460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853614.70517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853614.72390: stdout chunk (state=3): >>>/root <<< 28023 1726853614.72462: stdout chunk (state=3): >>><<< 28023 1726853614.72465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853614.72468: stderr chunk (state=3): >>><<< 28023 1726853614.72489: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853614.72591: _low_level_execute_command(): starting 28023 1726853614.72599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940 `" && echo ansible-tmp-1726853614.72504-28369-102182338393940="` echo /root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940 `" ) && sleep 0' 28023 1726853614.73112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853614.73121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.73132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853614.73146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853614.73353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853614.73357: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853614.73360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853614.73363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853614.73366: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853614.73369: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853614.73374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.73376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853614.73381: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853614.73383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853614.73405: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853614.73408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853614.73428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853614.75432: stdout chunk (state=3): >>>ansible-tmp-1726853614.72504-28369-102182338393940=/root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940 <<< 28023 1726853614.75589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853614.75593: stdout chunk (state=3): >>><<< 28023 1726853614.75595: stderr chunk (state=3): >>><<< 28023 1726853614.75777: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853614.72504-28369-102182338393940=/root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853614.75782: variable 'ansible_module_compression' from source: unknown 28023 1726853614.75784: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853614.75786: variable 'ansible_facts' from source: unknown 28023 1726853614.75850: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940/AnsiballZ_command.py 28023 1726853614.76031: Sending initial data 28023 1726853614.76041: Sent initial data (154 bytes) 28023 1726853614.76626: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853614.76642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.76684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.76701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853614.76716: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853614.76801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853614.76820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853614.76914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853614.78587: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853614.78678: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853614.78757: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpw7wrjpk3 /root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940/AnsiballZ_command.py <<< 28023 1726853614.78761: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940/AnsiballZ_command.py" <<< 28023 1726853614.78806: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpw7wrjpk3" to remote "/root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940/AnsiballZ_command.py" <<< 28023 1726853614.79634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853614.79672: stderr chunk (state=3): >>><<< 28023 1726853614.79683: stdout chunk (state=3): >>><<< 28023 1726853614.79751: done transferring module to remote 28023 1726853614.79836: _low_level_execute_command(): starting 28023 1726853614.79839: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940/ /root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940/AnsiballZ_command.py && sleep 0' 28023 1726853614.80449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853614.80469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.80503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853614.80611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853614.80640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853614.80745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853614.82842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853614.82858: stderr chunk (state=3): >>><<< 28023 1726853614.82874: stdout chunk (state=3): >>><<< 28023 1726853614.82898: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853614.82980: _low_level_execute_command(): starting 28023 1726853614.82984: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940/AnsiballZ_command.py && sleep 0' 28023 1726853614.83542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853614.83555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853614.83579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853614.83641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853614.83702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853614.83720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853614.83752: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853614.83859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.00326: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 13:33:34.991817", "end": "2024-09-20 13:33:34.997674", "delta": "0:00:00.005857", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853615.02647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853615.02673: stderr chunk (state=3): >>><<< 28023 1726853615.02676: stdout chunk (state=3): >>><<< 28023 1726853615.02693: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 13:33:34.991817", "end": "2024-09-20 13:33:34.997674", "delta": "0:00:00.005857", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853615.02725: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853615.02731: _low_level_execute_command(): starting 28023 1726853615.02736: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853614.72504-28369-102182338393940/ > /dev/null 2>&1 && 
sleep 0' 28023 1726853615.03184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.03188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.03190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.03192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.03237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.03252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.03319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.08229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.08257: stderr chunk (state=3): >>><<< 28023 1726853615.08261: stdout chunk (state=3): >>><<< 28023 1726853615.08275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.08281: handler run complete 28023 1726853615.08298: Evaluated conditional (False): False 28023 1726853615.08308: attempt loop complete, returning result 28023 1726853615.08323: variable 'item' from source: unknown 28023 1726853615.08387: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.005857", "end": "2024-09-20 13:33:34.997674", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-20 13:33:34.991817" } 28023 1726853615.08559: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853615.08562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853615.08564: variable 'omit' from source: magic vars 28023 1726853615.08626: variable 
'ansible_distribution_major_version' from source: facts 28023 1726853615.08629: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853615.08746: variable 'type' from source: set_fact 28023 1726853615.08749: variable 'state' from source: include params 28023 1726853615.08752: variable 'interface' from source: set_fact 28023 1726853615.08759: variable 'current_interfaces' from source: set_fact 28023 1726853615.08762: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28023 1726853615.08767: variable 'omit' from source: magic vars 28023 1726853615.08781: variable 'omit' from source: magic vars 28023 1726853615.08810: variable 'item' from source: unknown 28023 1726853615.08851: variable 'item' from source: unknown 28023 1726853615.08863: variable 'omit' from source: magic vars 28023 1726853615.08881: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853615.08888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853615.08900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853615.08909: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853615.08912: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853615.08914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853615.08962: Set connection var ansible_shell_type to sh 28023 1726853615.08966: Set connection var ansible_shell_executable to /bin/sh 28023 1726853615.08973: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853615.08979: Set connection var ansible_connection to ssh 28023 1726853615.08983: Set connection var 
ansible_pipelining to False 28023 1726853615.08988: Set connection var ansible_timeout to 10 28023 1726853615.09009: variable 'ansible_shell_executable' from source: unknown 28023 1726853615.09012: variable 'ansible_connection' from source: unknown 28023 1726853615.09014: variable 'ansible_module_compression' from source: unknown 28023 1726853615.09016: variable 'ansible_shell_type' from source: unknown 28023 1726853615.09018: variable 'ansible_shell_executable' from source: unknown 28023 1726853615.09020: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853615.09023: variable 'ansible_pipelining' from source: unknown 28023 1726853615.09025: variable 'ansible_timeout' from source: unknown 28023 1726853615.09029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853615.09094: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853615.09102: variable 'omit' from source: magic vars 28023 1726853615.09105: starting attempt loop 28023 1726853615.09110: running the handler 28023 1726853615.09119: _low_level_execute_command(): starting 28023 1726853615.09122: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853615.09791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.09816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.09828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.09840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.09913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.11623: stdout chunk (state=3): >>>/root <<< 28023 1726853615.11788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.11791: stdout chunk (state=3): >>><<< 28023 1726853615.11793: stderr chunk (state=3): >>><<< 28023 1726853615.11978: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.11982: _low_level_execute_command(): starting 28023 1726853615.11985: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831 `" && echo ansible-tmp-1726853615.117991-28369-65246048366831="` echo /root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831 `" ) && sleep 0' 28023 1726853615.12452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853615.12464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.12477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.12492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853615.12504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853615.12512: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853615.12526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.12540: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853615.12547: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853615.12554: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853615.12565: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.12635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.12657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.12674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.12689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.12774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.14763: stdout chunk (state=3): >>>ansible-tmp-1726853615.117991-28369-65246048366831=/root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831 <<< 28023 1726853615.14894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.15013: stderr chunk (state=3): >>><<< 28023 1726853615.15016: stdout chunk (state=3): >>><<< 28023 1726853615.15019: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853615.117991-28369-65246048366831=/root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.15021: variable 'ansible_module_compression' from source: unknown 28023 1726853615.15051: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853615.15080: variable 'ansible_facts' from source: unknown 28023 1726853615.15165: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831/AnsiballZ_command.py 28023 1726853615.15382: Sending initial data 28023 1726853615.15385: Sent initial data (154 bytes) 28023 1726853615.15986: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853615.16053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.16116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.16137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.16168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.16248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.17872: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853615.17941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853615.18017: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpwdbimntw /root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831/AnsiballZ_command.py <<< 28023 1726853615.18021: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831/AnsiballZ_command.py" <<< 28023 1726853615.18128: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpwdbimntw" to remote "/root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831/AnsiballZ_command.py" <<< 28023 1726853615.19230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.19233: stdout chunk (state=3): >>><<< 28023 1726853615.19236: stderr chunk (state=3): >>><<< 28023 1726853615.19238: done transferring module to remote 28023 1726853615.19240: _low_level_execute_command(): starting 28023 1726853615.19242: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831/ /root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831/AnsiballZ_command.py && sleep 0' 28023 1726853615.19975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853615.19982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.20018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.20021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853615.20024: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.20026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.20028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.20079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.20089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.20143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.22518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.22521: stdout chunk (state=3): >>><<< 28023 1726853615.22524: stderr chunk (state=3): >>><<< 28023 1726853615.22527: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.22529: _low_level_execute_command(): starting 28023 1726853615.22531: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831/AnsiballZ_command.py && sleep 0' 28023 1726853615.23024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853615.23031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.23042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.23058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853615.23067: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853615.23076: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853615.23162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.23196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.23291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.39198: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 13:33:35.386621", "end": "2024-09-20 13:33:35.390685", "delta": "0:00:00.004064", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853615.40813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853615.40827: stdout chunk (state=3): >>><<< 28023 1726853615.40841: stderr chunk (state=3): >>><<< 28023 1726853615.40870: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 13:33:35.386621", "end": "2024-09-20 13:33:35.390685", "delta": "0:00:00.004064", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853615.40915: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853615.40933: _low_level_execute_command(): starting 28023 1726853615.40943: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853615.117991-28369-65246048366831/ > /dev/null 2>&1 && sleep 0' 28023 1726853615.41620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853615.41640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.41659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.41679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853615.41698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853615.41742: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.41816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.41846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.41873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.41972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.43893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.43910: stdout chunk (state=3): >>><<< 28023 1726853615.44078: stderr chunk (state=3): >>><<< 28023 1726853615.44083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.44086: handler run complete 28023 1726853615.44088: Evaluated conditional (False): False 28023 1726853615.44090: attempt loop complete, returning result 28023 1726853615.44092: variable 'item' from source: unknown 28023 1726853615.44105: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.004064", "end": "2024-09-20 13:33:35.390685", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-20 13:33:35.386621" } 28023 1726853615.44331: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853615.44344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853615.44359: variable 'omit' from source: magic vars 28023 1726853615.44545: variable 'ansible_distribution_major_version' from source: facts 28023 1726853615.44576: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853615.44770: variable 'type' from source: set_fact 28023 1726853615.44841: variable 'state' from source: include params 28023 1726853615.44844: variable 'interface' from source: set_fact 28023 1726853615.44846: variable 'current_interfaces' from source: set_fact 28023 1726853615.44848: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28023 1726853615.44850: variable 'omit' from source: magic vars 28023 1726853615.44860: variable 'omit' from source: magic vars 28023 1726853615.44885: variable 'item' from source: unknown 28023 1726853615.44954: variable 'item' from source: unknown 28023 1726853615.44982: variable 'omit' from source: magic vars 28023 
1726853615.45007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853615.45020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853615.45031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853615.45061: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853615.45064: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853615.45167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853615.45172: Set connection var ansible_shell_type to sh 28023 1726853615.45175: Set connection var ansible_shell_executable to /bin/sh 28023 1726853615.45189: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853615.45201: Set connection var ansible_connection to ssh 28023 1726853615.45211: Set connection var ansible_pipelining to False 28023 1726853615.45221: Set connection var ansible_timeout to 10 28023 1726853615.45249: variable 'ansible_shell_executable' from source: unknown 28023 1726853615.45261: variable 'ansible_connection' from source: unknown 28023 1726853615.45277: variable 'ansible_module_compression' from source: unknown 28023 1726853615.45288: variable 'ansible_shell_type' from source: unknown 28023 1726853615.45376: variable 'ansible_shell_executable' from source: unknown 28023 1726853615.45381: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853615.45383: variable 'ansible_pipelining' from source: unknown 28023 1726853615.45385: variable 'ansible_timeout' from source: unknown 28023 1726853615.45387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853615.45438: Loading ActionModule 
'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853615.45504: variable 'omit' from source: magic vars 28023 1726853615.45510: starting attempt loop 28023 1726853615.45512: running the handler 28023 1726853615.45514: _low_level_execute_command(): starting 28023 1726853615.45517: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853615.46282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853615.46286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.46311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.46329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.46349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.46449: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.48152: stdout chunk (state=3): >>>/root <<< 28023 1726853615.48306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.48336: stdout chunk (state=3): >>><<< 28023 1726853615.48352: stderr chunk (state=3): >>><<< 28023 1726853615.48533: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.48537: _low_level_execute_command(): starting 28023 1726853615.48539: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735 `" && echo ansible-tmp-1726853615.4839785-28369-233505722133735="` echo 
/root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735 `" ) && sleep 0' 28023 1726853615.49870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.50008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.50074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.52095: stdout chunk (state=3): >>>ansible-tmp-1726853615.4839785-28369-233505722133735=/root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735 <<< 28023 1726853615.52312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.52315: stdout chunk (state=3): >>><<< 28023 1726853615.52318: stderr chunk (state=3): >>><<< 28023 1726853615.52585: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853615.4839785-28369-233505722133735=/root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735 , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.52588: variable 'ansible_module_compression' from source: unknown 28023 1726853615.52590: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853615.52592: variable 'ansible_facts' from source: unknown 28023 1726853615.52594: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735/AnsiballZ_command.py 28023 1726853615.52732: Sending initial data 28023 1726853615.52783: Sent initial data (156 bytes) 28023 1726853615.53426: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.53442: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853615.53452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.53499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.53512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.53600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.55266: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 
1726853615.55326: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853615.55382: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpdzq290ra /root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735/AnsiballZ_command.py <<< 28023 1726853615.55390: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735/AnsiballZ_command.py" <<< 28023 1726853615.55447: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpdzq290ra" to remote "/root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735/AnsiballZ_command.py" <<< 28023 1726853615.55451: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735/AnsiballZ_command.py" <<< 28023 1726853615.56040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.56086: stderr chunk (state=3): >>><<< 28023 1726853615.56094: stdout chunk (state=3): >>><<< 28023 1726853615.56118: done transferring module to remote 28023 1726853615.56125: _low_level_execute_command(): starting 28023 1726853615.56129: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735/ /root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735/AnsiballZ_command.py && sleep 0' 28023 1726853615.56748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.56752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.56755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.56778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.56784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.56865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.58694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.58727: stderr chunk (state=3): >>><<< 28023 1726853615.58730: stdout chunk (state=3): >>><<< 28023 1726853615.58743: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.58751: _low_level_execute_command(): starting 28023 1726853615.58753: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735/AnsiballZ_command.py && sleep 0' 28023 1726853615.59178: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.59181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853615.59184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.59186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.59234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.59237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.59308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.75469: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 13:33:35.748591", "end": "2024-09-20 13:33:35.752464", "delta": "0:00:00.003873", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853615.77058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853615.77090: stderr chunk (state=3): >>><<< 28023 1726853615.77093: stdout chunk (state=3): >>><<< 28023 1726853615.77107: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 13:33:35.748591", "end": "2024-09-20 13:33:35.752464", "delta": "0:00:00.003873", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853615.77128: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853615.77132: _low_level_execute_command(): starting 28023 1726853615.77137: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853615.4839785-28369-233505722133735/ > /dev/null 2>&1 && sleep 0' 28023 1726853615.77761: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.77765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853615.77767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.77769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853615.77774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.77815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.77818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.77824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.77893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.79978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.79982: stdout chunk (state=3): >>><<< 28023 1726853615.79984: stderr chunk (state=3): >>><<< 28023 1726853615.79986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.79993: handler run complete 28023 1726853615.79995: Evaluated conditional (False): False 28023 1726853615.79997: attempt loop complete, returning result 28023 1726853615.79999: variable 'item' from source: unknown 28023 1726853615.80001: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003873", "end": "2024-09-20 13:33:35.752464", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-20 13:33:35.748591" } 28023 1726853615.80097: dumping result to json 28023 1726853615.80100: done dumping result, returning 28023 1726853615.80102: done running TaskExecutor() for managed_node3/TASK: Create veth interface ethtest0 [02083763-bbaf-fdb6-dad7-00000000016e] 28023 1726853615.80103: sending task result for task 02083763-bbaf-fdb6-dad7-00000000016e 28023 1726853615.80146: 
done sending task result for task 02083763-bbaf-fdb6-dad7-00000000016e 28023 1726853615.80148: WORKER PROCESS EXITING 28023 1726853615.80208: no more pending results, returning what we have 28023 1726853615.80211: results queue empty 28023 1726853615.80212: checking for any_errors_fatal 28023 1726853615.80217: done checking for any_errors_fatal 28023 1726853615.80217: checking for max_fail_percentage 28023 1726853615.80219: done checking for max_fail_percentage 28023 1726853615.80219: checking to see if all hosts have failed and the running result is not ok 28023 1726853615.80220: done checking to see if all hosts have failed 28023 1726853615.80221: getting the remaining hosts for this loop 28023 1726853615.80222: done getting the remaining hosts for this loop 28023 1726853615.80226: getting the next task for host managed_node3 28023 1726853615.80231: done getting next task for host managed_node3 28023 1726853615.80235: ^ task is: TASK: Set up veth as managed by NetworkManager 28023 1726853615.80237: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853615.80241: getting variables 28023 1726853615.80242: in VariableManager get_vars() 28023 1726853615.80283: Calling all_inventory to load vars for managed_node3 28023 1726853615.80286: Calling groups_inventory to load vars for managed_node3 28023 1726853615.80288: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853615.80298: Calling all_plugins_play to load vars for managed_node3 28023 1726853615.80300: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853615.80303: Calling groups_plugins_play to load vars for managed_node3 28023 1726853615.80525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853615.80756: done with get_vars() 28023 1726853615.80768: done getting variables 28023 1726853615.80835: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:33:35 -0400 (0:00:01.133) 0:00:07.892 ****** 28023 1726853615.80861: entering _queue_task() for managed_node3/command 28023 1726853615.81203: worker is 1 (out of 1 available) 28023 1726853615.81213: exiting _queue_task() for managed_node3/command 28023 1726853615.81226: done queuing things up, now waiting for results queue to drain 28023 1726853615.81227: waiting for pending results... 
28023 1726853615.81691: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 28023 1726853615.81695: in run() - task 02083763-bbaf-fdb6-dad7-00000000016f 28023 1726853615.81698: variable 'ansible_search_path' from source: unknown 28023 1726853615.81701: variable 'ansible_search_path' from source: unknown 28023 1726853615.81703: calling self._execute() 28023 1726853615.81714: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853615.81724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853615.81737: variable 'omit' from source: magic vars 28023 1726853615.82161: variable 'ansible_distribution_major_version' from source: facts 28023 1726853615.82181: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853615.82358: variable 'type' from source: set_fact 28023 1726853615.82369: variable 'state' from source: include params 28023 1726853615.82382: Evaluated conditional (type == 'veth' and state == 'present'): True 28023 1726853615.82393: variable 'omit' from source: magic vars 28023 1726853615.82480: variable 'omit' from source: magic vars 28023 1726853615.82642: variable 'interface' from source: set_fact 28023 1726853615.82668: variable 'omit' from source: magic vars 28023 1726853615.82724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853615.82749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853615.82774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853615.82788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853615.82797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
28023 1726853615.82820: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853615.82823: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853615.82826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853615.82905: Set connection var ansible_shell_type to sh 28023 1726853615.82912: Set connection var ansible_shell_executable to /bin/sh 28023 1726853615.82918: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853615.82923: Set connection var ansible_connection to ssh 28023 1726853615.82928: Set connection var ansible_pipelining to False 28023 1726853615.82933: Set connection var ansible_timeout to 10 28023 1726853615.82953: variable 'ansible_shell_executable' from source: unknown 28023 1726853615.82958: variable 'ansible_connection' from source: unknown 28023 1726853615.82961: variable 'ansible_module_compression' from source: unknown 28023 1726853615.82963: variable 'ansible_shell_type' from source: unknown 28023 1726853615.82966: variable 'ansible_shell_executable' from source: unknown 28023 1726853615.82968: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853615.82979: variable 'ansible_pipelining' from source: unknown 28023 1726853615.82982: variable 'ansible_timeout' from source: unknown 28023 1726853615.82984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853615.83077: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853615.83087: variable 'omit' from source: magic vars 28023 1726853615.83093: starting attempt loop 28023 1726853615.83096: running the handler 28023 1726853615.83110: _low_level_execute_command(): 
starting 28023 1726853615.83117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853615.83613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.83617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.83620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.83624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.83666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.83687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.83751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.85476: stdout chunk (state=3): >>>/root <<< 28023 1726853615.85600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.85632: stderr chunk (state=3): >>><<< 28023 1726853615.85650: stdout chunk (state=3): >>><<< 28023 1726853615.85675: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.85783: _low_level_execute_command(): starting 28023 1726853615.85787: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769 `" && echo ansible-tmp-1726853615.8568234-28430-266062384555769="` echo /root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769 `" ) && sleep 0' 28023 1726853615.86359: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853615.86375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.86392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853615.86424: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853615.86441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853615.86451: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853615.86539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.86575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.86591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.86613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.86712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.88928: stdout chunk (state=3): >>>ansible-tmp-1726853615.8568234-28430-266062384555769=/root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769 <<< 28023 1726853615.88933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.88961: stderr chunk (state=3): >>><<< 28023 1726853615.88964: stdout chunk (state=3): >>><<< 28023 1726853615.89016: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853615.8568234-28430-266062384555769=/root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.89038: variable 'ansible_module_compression' from source: unknown 28023 1726853615.89101: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853615.89147: variable 'ansible_facts' from source: unknown 28023 1726853615.89377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769/AnsiballZ_command.py 28023 1726853615.89380: Sending initial data 28023 1726853615.89382: Sent initial data (156 bytes) 28023 1726853615.90098: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.90135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.90155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.90173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.90266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.91933: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853615.92014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853615.92083: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpxsbykl0_ /root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769/AnsiballZ_command.py <<< 28023 1726853615.92094: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769/AnsiballZ_command.py" <<< 28023 1726853615.92155: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpxsbykl0_" to remote "/root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769/AnsiballZ_command.py" <<< 28023 1726853615.93061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.93146: stdout chunk (state=3): >>><<< 28023 1726853615.93149: stderr chunk (state=3): >>><<< 28023 1726853615.93152: done transferring module to remote 28023 1726853615.93177: _low_level_execute_command(): starting 28023 1726853615.93187: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769/ /root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769/AnsiballZ_command.py && sleep 0' 28023 1726853615.93856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853615.93874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.93937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.94005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853615.94029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853615.94076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.94166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853615.96083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853615.96087: stdout chunk (state=3): >>><<< 28023 1726853615.96090: stderr chunk (state=3): >>><<< 28023 1726853615.96110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853615.96200: _low_level_execute_command(): starting 28023 1726853615.96204: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769/AnsiballZ_command.py && sleep 0' 28023 1726853615.97019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853615.97027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853615.97034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28023 1726853615.97036: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853615.97039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853615.97111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853615.97181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853616.14926: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 13:33:36.127557", "end": "2024-09-20 13:33:36.147997", "delta": "0:00:00.020440", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853616.16812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853616.16821: stdout chunk (state=3): >>><<< 28023 1726853616.16824: stderr chunk (state=3): >>><<< 28023 1726853616.16827: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 13:33:36.127557", "end": "2024-09-20 13:33:36.147997", "delta": "0:00:00.020440", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853616.16829: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853616.16832: _low_level_execute_command(): starting 28023 1726853616.16834: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853615.8568234-28430-266062384555769/ > /dev/null 2>&1 && sleep 0' 28023 1726853616.17424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 <<< 28023 1726853616.17438: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853616.17455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853616.17483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853616.17589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853616.17632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853616.17688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853616.19677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853616.19690: stdout chunk (state=3): >>><<< 28023 1726853616.19703: stderr chunk (state=3): >>><<< 28023 1726853616.19724: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853616.19736: handler run complete 28023 1726853616.19767: Evaluated conditional (False): False 28023 1726853616.19785: attempt loop complete, returning result 28023 1726853616.19791: _execute() done 28023 1726853616.19798: dumping result to json 28023 1726853616.19806: done dumping result, returning 28023 1726853616.19817: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-fdb6-dad7-00000000016f] 28023 1726853616.19825: sending task result for task 02083763-bbaf-fdb6-dad7-00000000016f ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.020440", "end": "2024-09-20 13:33:36.147997", "rc": 0, "start": "2024-09-20 13:33:36.127557" } 28023 1726853616.20025: no more pending results, returning what we have 28023 1726853616.20028: results queue empty 28023 1726853616.20029: checking for any_errors_fatal 28023 1726853616.20039: done checking for any_errors_fatal 28023 1726853616.20039: checking for max_fail_percentage 28023 1726853616.20041: done checking for 
max_fail_percentage 28023 1726853616.20041: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.20042: done checking to see if all hosts have failed 28023 1726853616.20043: getting the remaining hosts for this loop 28023 1726853616.20044: done getting the remaining hosts for this loop 28023 1726853616.20047: getting the next task for host managed_node3 28023 1726853616.20053: done getting next task for host managed_node3 28023 1726853616.20056: ^ task is: TASK: Delete veth interface {{ interface }} 28023 1726853616.20059: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.20064: getting variables 28023 1726853616.20066: in VariableManager get_vars() 28023 1726853616.20107: Calling all_inventory to load vars for managed_node3 28023 1726853616.20109: Calling groups_inventory to load vars for managed_node3 28023 1726853616.20111: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.20123: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.20125: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.20128: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.20451: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000016f 28023 1726853616.20455: WORKER PROCESS EXITING 28023 1726853616.20484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.20703: done with get_vars() 28023 1726853616.20715: done getting variables 28023 1726853616.20778: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853616.20913: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 13:33:36 -0400 (0:00:00.400) 0:00:08.293 ****** 28023 1726853616.20943: entering _queue_task() for managed_node3/command 28023 1726853616.21295: worker is 1 (out of 1 available) 28023 1726853616.21306: exiting _queue_task() for managed_node3/command 28023 1726853616.21324: done queuing things up, now waiting for results queue to drain 28023 1726853616.21326: waiting for pending results... 
28023 1726853616.21585: running TaskExecutor() for managed_node3/TASK: Delete veth interface ethtest0 28023 1726853616.21686: in run() - task 02083763-bbaf-fdb6-dad7-000000000170 28023 1726853616.21710: variable 'ansible_search_path' from source: unknown 28023 1726853616.21717: variable 'ansible_search_path' from source: unknown 28023 1726853616.21755: calling self._execute() 28023 1726853616.21849: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.21863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.21879: variable 'omit' from source: magic vars 28023 1726853616.22300: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.22316: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.22522: variable 'type' from source: set_fact 28023 1726853616.22531: variable 'state' from source: include params 28023 1726853616.22539: variable 'interface' from source: set_fact 28023 1726853616.22546: variable 'current_interfaces' from source: set_fact 28023 1726853616.22561: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 28023 1726853616.22573: when evaluation is False, skipping this task 28023 1726853616.22581: _execute() done 28023 1726853616.22588: dumping result to json 28023 1726853616.22594: done dumping result, returning 28023 1726853616.22604: done running TaskExecutor() for managed_node3/TASK: Delete veth interface ethtest0 [02083763-bbaf-fdb6-dad7-000000000170] 28023 1726853616.22613: sending task result for task 02083763-bbaf-fdb6-dad7-000000000170 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 28023 1726853616.22748: no more pending results, returning what we have 28023 1726853616.22752: results queue empty 28023 
1726853616.22753: checking for any_errors_fatal 28023 1726853616.22763: done checking for any_errors_fatal 28023 1726853616.22764: checking for max_fail_percentage 28023 1726853616.22766: done checking for max_fail_percentage 28023 1726853616.22767: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.22768: done checking to see if all hosts have failed 28023 1726853616.22769: getting the remaining hosts for this loop 28023 1726853616.22772: done getting the remaining hosts for this loop 28023 1726853616.22775: getting the next task for host managed_node3 28023 1726853616.22783: done getting next task for host managed_node3 28023 1726853616.22785: ^ task is: TASK: Create dummy interface {{ interface }} 28023 1726853616.22789: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.22794: getting variables 28023 1726853616.22795: in VariableManager get_vars() 28023 1726853616.22843: Calling all_inventory to load vars for managed_node3 28023 1726853616.22846: Calling groups_inventory to load vars for managed_node3 28023 1726853616.22848: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.22867: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.22870: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.23085: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000170 28023 1726853616.23089: WORKER PROCESS EXITING 28023 1726853616.23094: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.23506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.23933: done with get_vars() 28023 1726853616.23943: done getting variables 28023 1726853616.24023: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853616.24150: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:33:36 -0400 (0:00:00.032) 0:00:08.325 ****** 28023 1726853616.24193: entering _queue_task() for managed_node3/command 28023 1726853616.24733: worker is 1 (out of 1 available) 28023 1726853616.24745: exiting _queue_task() for managed_node3/command 28023 1726853616.24759: done queuing things up, now waiting for results queue to drain 28023 1726853616.24761: waiting for pending results... 
28023 1726853616.25089: running TaskExecutor() for managed_node3/TASK: Create dummy interface ethtest0 28023 1726853616.25096: in run() - task 02083763-bbaf-fdb6-dad7-000000000171 28023 1726853616.25113: variable 'ansible_search_path' from source: unknown 28023 1726853616.25121: variable 'ansible_search_path' from source: unknown 28023 1726853616.25173: calling self._execute() 28023 1726853616.25267: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.25283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.25301: variable 'omit' from source: magic vars 28023 1726853616.25663: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.25683: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.26009: variable 'type' from source: set_fact 28023 1726853616.26019: variable 'state' from source: include params 28023 1726853616.26027: variable 'interface' from source: set_fact 28023 1726853616.26036: variable 'current_interfaces' from source: set_fact 28023 1726853616.26048: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 28023 1726853616.26084: when evaluation is False, skipping this task 28023 1726853616.26092: _execute() done 28023 1726853616.26100: dumping result to json 28023 1726853616.26140: done dumping result, returning 28023 1726853616.26277: done running TaskExecutor() for managed_node3/TASK: Create dummy interface ethtest0 [02083763-bbaf-fdb6-dad7-000000000171] 28023 1726853616.26280: sending task result for task 02083763-bbaf-fdb6-dad7-000000000171 28023 1726853616.26351: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000171 28023 1726853616.26354: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 28023 1726853616.26421: no more pending results, returning what we have 28023 1726853616.26425: results queue empty 28023 1726853616.26427: checking for any_errors_fatal 28023 1726853616.26433: done checking for any_errors_fatal 28023 1726853616.26434: checking for max_fail_percentage 28023 1726853616.26436: done checking for max_fail_percentage 28023 1726853616.26436: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.26437: done checking to see if all hosts have failed 28023 1726853616.26438: getting the remaining hosts for this loop 28023 1726853616.26440: done getting the remaining hosts for this loop 28023 1726853616.26444: getting the next task for host managed_node3 28023 1726853616.26451: done getting next task for host managed_node3 28023 1726853616.26454: ^ task is: TASK: Delete dummy interface {{ interface }} 28023 1726853616.26460: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.26466: getting variables 28023 1726853616.26467: in VariableManager get_vars() 28023 1726853616.26515: Calling all_inventory to load vars for managed_node3 28023 1726853616.26518: Calling groups_inventory to load vars for managed_node3 28023 1726853616.26520: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.26535: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.26538: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.26541: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.26939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.27207: done with get_vars() 28023 1726853616.27218: done getting variables 28023 1726853616.27275: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853616.27380: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:33:36 -0400 (0:00:00.032) 0:00:08.357 ****** 28023 1726853616.27407: entering _queue_task() for managed_node3/command 28023 1726853616.27682: worker is 1 (out of 1 available) 28023 1726853616.27695: exiting _queue_task() for managed_node3/command 28023 1726853616.27706: done queuing things up, now waiting for results queue to drain 28023 1726853616.27708: waiting for pending results... 
28023 1726853616.28093: running TaskExecutor() for managed_node3/TASK: Delete dummy interface ethtest0 28023 1726853616.28098: in run() - task 02083763-bbaf-fdb6-dad7-000000000172 28023 1726853616.28101: variable 'ansible_search_path' from source: unknown 28023 1726853616.28172: variable 'ansible_search_path' from source: unknown 28023 1726853616.28387: calling self._execute() 28023 1726853616.28497: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.28516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.28530: variable 'omit' from source: magic vars 28023 1726853616.28916: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.28939: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.29167: variable 'type' from source: set_fact 28023 1726853616.29180: variable 'state' from source: include params 28023 1726853616.29188: variable 'interface' from source: set_fact 28023 1726853616.29195: variable 'current_interfaces' from source: set_fact 28023 1726853616.29206: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 28023 1726853616.29213: when evaluation is False, skipping this task 28023 1726853616.29220: _execute() done 28023 1726853616.29228: dumping result to json 28023 1726853616.29260: done dumping result, returning 28023 1726853616.29264: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface ethtest0 [02083763-bbaf-fdb6-dad7-000000000172] 28023 1726853616.29266: sending task result for task 02083763-bbaf-fdb6-dad7-000000000172 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 28023 1726853616.29615: no more pending results, returning what we have 28023 1726853616.29618: results queue empty 28023 
1726853616.29619: checking for any_errors_fatal 28023 1726853616.29625: done checking for any_errors_fatal 28023 1726853616.29626: checking for max_fail_percentage 28023 1726853616.29627: done checking for max_fail_percentage 28023 1726853616.29628: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.29629: done checking to see if all hosts have failed 28023 1726853616.29630: getting the remaining hosts for this loop 28023 1726853616.29631: done getting the remaining hosts for this loop 28023 1726853616.29634: getting the next task for host managed_node3 28023 1726853616.29640: done getting next task for host managed_node3 28023 1726853616.29643: ^ task is: TASK: Create tap interface {{ interface }} 28023 1726853616.29646: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.29649: getting variables 28023 1726853616.29651: in VariableManager get_vars() 28023 1726853616.29692: Calling all_inventory to load vars for managed_node3 28023 1726853616.29695: Calling groups_inventory to load vars for managed_node3 28023 1726853616.29698: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.29708: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.29711: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.29714: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.30052: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000172 28023 1726853616.30058: WORKER PROCESS EXITING 28023 1726853616.30085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.30330: done with get_vars() 28023 1726853616.30342: done getting variables 28023 1726853616.30404: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853616.30520: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:33:36 -0400 (0:00:00.031) 0:00:08.389 ****** 28023 1726853616.30554: entering _queue_task() for managed_node3/command 28023 1726853616.30909: worker is 1 (out of 1 available) 28023 1726853616.30921: exiting _queue_task() for managed_node3/command 28023 1726853616.30933: done queuing things up, now waiting for results queue to drain 28023 1726853616.30935: waiting for pending results... 
28023 1726853616.31161: running TaskExecutor() for managed_node3/TASK: Create tap interface ethtest0 28023 1726853616.31267: in run() - task 02083763-bbaf-fdb6-dad7-000000000173 28023 1726853616.31291: variable 'ansible_search_path' from source: unknown 28023 1726853616.31301: variable 'ansible_search_path' from source: unknown 28023 1726853616.31420: calling self._execute() 28023 1726853616.31449: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.31464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.31479: variable 'omit' from source: magic vars 28023 1726853616.31843: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.31870: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.32089: variable 'type' from source: set_fact 28023 1726853616.32098: variable 'state' from source: include params 28023 1726853616.32104: variable 'interface' from source: set_fact 28023 1726853616.32111: variable 'current_interfaces' from source: set_fact 28023 1726853616.32179: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 28023 1726853616.32182: when evaluation is False, skipping this task 28023 1726853616.32184: _execute() done 28023 1726853616.32186: dumping result to json 28023 1726853616.32188: done dumping result, returning 28023 1726853616.32190: done running TaskExecutor() for managed_node3/TASK: Create tap interface ethtest0 [02083763-bbaf-fdb6-dad7-000000000173] 28023 1726853616.32191: sending task result for task 02083763-bbaf-fdb6-dad7-000000000173 28023 1726853616.32251: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000173 28023 1726853616.32254: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 28023 1726853616.32329: no more pending results, returning what we have 28023 1726853616.32332: results queue empty 28023 1726853616.32333: checking for any_errors_fatal 28023 1726853616.32340: done checking for any_errors_fatal 28023 1726853616.32341: checking for max_fail_percentage 28023 1726853616.32343: done checking for max_fail_percentage 28023 1726853616.32343: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.32344: done checking to see if all hosts have failed 28023 1726853616.32345: getting the remaining hosts for this loop 28023 1726853616.32347: done getting the remaining hosts for this loop 28023 1726853616.32350: getting the next task for host managed_node3 28023 1726853616.32360: done getting next task for host managed_node3 28023 1726853616.32363: ^ task is: TASK: Delete tap interface {{ interface }} 28023 1726853616.32372: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.32377: getting variables 28023 1726853616.32379: in VariableManager get_vars() 28023 1726853616.32422: Calling all_inventory to load vars for managed_node3 28023 1726853616.32425: Calling groups_inventory to load vars for managed_node3 28023 1726853616.32427: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.32442: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.32445: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.32447: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.33127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.33352: done with get_vars() 28023 1726853616.33364: done getting variables 28023 1726853616.33423: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853616.33534: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:33:36 -0400 (0:00:00.030) 0:00:08.419 ****** 28023 1726853616.33564: entering _queue_task() for managed_node3/command 28023 1726853616.33890: worker is 1 (out of 1 available) 28023 1726853616.33900: exiting _queue_task() for managed_node3/command 28023 1726853616.33911: done queuing things up, now waiting for results queue to drain 28023 1726853616.33912: waiting for pending results... 
28023 1726853616.34293: running TaskExecutor() for managed_node3/TASK: Delete tap interface ethtest0 28023 1726853616.34300: in run() - task 02083763-bbaf-fdb6-dad7-000000000174 28023 1726853616.34302: variable 'ansible_search_path' from source: unknown 28023 1726853616.34304: variable 'ansible_search_path' from source: unknown 28023 1726853616.34322: calling self._execute() 28023 1726853616.34422: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.34433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.34445: variable 'omit' from source: magic vars 28023 1726853616.34814: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.34838: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.35053: variable 'type' from source: set_fact 28023 1726853616.35165: variable 'state' from source: include params 28023 1726853616.35168: variable 'interface' from source: set_fact 28023 1726853616.35173: variable 'current_interfaces' from source: set_fact 28023 1726853616.35176: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 28023 1726853616.35179: when evaluation is False, skipping this task 28023 1726853616.35181: _execute() done 28023 1726853616.35183: dumping result to json 28023 1726853616.35185: done dumping result, returning 28023 1726853616.35187: done running TaskExecutor() for managed_node3/TASK: Delete tap interface ethtest0 [02083763-bbaf-fdb6-dad7-000000000174] 28023 1726853616.35189: sending task result for task 02083763-bbaf-fdb6-dad7-000000000174 28023 1726853616.35253: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000174 28023 1726853616.35260: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
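The four consecutive skips above (create/delete dummy, create/delete tap) all come from manage_test_interface.yml, where each `command` task is guarded by a `when` clause over `type`, `state`, and `current_interfaces`. The `when` expressions are confirmed by the log (they are quoted verbatim in the `false_condition` fields of each skip result); the `ip` commands themselves never appear in this log, so the ones in the sketch below are illustrative assumptions only:

```yaml
# Sketch of the guarded tasks in manage_test_interface.yml.
# Only the `when` expressions are grounded in the log output
# (they match the false_condition strings); the commands are assumed.
- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy          # assumed command
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: Delete dummy interface {{ interface }}
  command: ip link del {{ interface }} type dummy          # assumed command
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces

- name: Create tap interface {{ interface }}
  command: ip tuntap add dev {{ interface }} mode tap      # assumed command
  when: type == 'tap' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}
  command: ip tuntap del dev {{ interface }} mode tap      # assumed command
  when: type == 'tap' and state == 'absent' and interface in current_interfaces
```

When a `when` expression evaluates to false, the task executor short-circuits before running the action, which is why each skip above reports `"changed": false` together with the failed conditional in `false_condition` and never reaches the command module.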
28023 1726853616.35311: no more pending results, returning what we have 28023 1726853616.35315: results queue empty 28023 1726853616.35316: checking for any_errors_fatal 28023 1726853616.35325: done checking for any_errors_fatal 28023 1726853616.35326: checking for max_fail_percentage 28023 1726853616.35328: done checking for max_fail_percentage 28023 1726853616.35328: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.35329: done checking to see if all hosts have failed 28023 1726853616.35330: getting the remaining hosts for this loop 28023 1726853616.35332: done getting the remaining hosts for this loop 28023 1726853616.35335: getting the next task for host managed_node3 28023 1726853616.35344: done getting next task for host managed_node3 28023 1726853616.35348: ^ task is: TASK: Assert device is present 28023 1726853616.35351: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.35359: getting variables 28023 1726853616.35360: in VariableManager get_vars() 28023 1726853616.35404: Calling all_inventory to load vars for managed_node3 28023 1726853616.35407: Calling groups_inventory to load vars for managed_node3 28023 1726853616.35410: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.35425: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.35428: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.35431: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.35854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.36100: done with get_vars() 28023 1726853616.36112: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:21 Friday 20 September 2024 13:33:36 -0400 (0:00:00.026) 0:00:08.446 ****** 28023 1726853616.36212: entering _queue_task() for managed_node3/include_tasks 28023 1726853616.36602: worker is 1 (out of 1 available) 28023 1726853616.36614: exiting _queue_task() for managed_node3/include_tasks 28023 1726853616.36626: done queuing things up, now waiting for results queue to drain 28023 1726853616.36628: waiting for pending results... 
28023 1726853616.36986: running TaskExecutor() for managed_node3/TASK: Assert device is present 28023 1726853616.36991: in run() - task 02083763-bbaf-fdb6-dad7-00000000000e 28023 1726853616.36994: variable 'ansible_search_path' from source: unknown 28023 1726853616.36997: calling self._execute() 28023 1726853616.37093: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.37106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.37129: variable 'omit' from source: magic vars 28023 1726853616.37520: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.37538: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.37663: _execute() done 28023 1726853616.37667: dumping result to json 28023 1726853616.37669: done dumping result, returning 28023 1726853616.37674: done running TaskExecutor() for managed_node3/TASK: Assert device is present [02083763-bbaf-fdb6-dad7-00000000000e] 28023 1726853616.37677: sending task result for task 02083763-bbaf-fdb6-dad7-00000000000e 28023 1726853616.37746: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000000e 28023 1726853616.37749: WORKER PROCESS EXITING 28023 1726853616.37797: no more pending results, returning what we have 28023 1726853616.37803: in VariableManager get_vars() 28023 1726853616.37861: Calling all_inventory to load vars for managed_node3 28023 1726853616.37865: Calling groups_inventory to load vars for managed_node3 28023 1726853616.37867: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.37889: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.37893: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.37897: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.38394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 28023 1726853616.38609: done with get_vars() 28023 1726853616.38616: variable 'ansible_search_path' from source: unknown 28023 1726853616.38628: we have included files to process 28023 1726853616.38629: generating all_blocks data 28023 1726853616.38635: done generating all_blocks data 28023 1726853616.38639: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28023 1726853616.38640: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28023 1726853616.38642: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28023 1726853616.38797: in VariableManager get_vars() 28023 1726853616.38816: done with get_vars() 28023 1726853616.38924: done processing included file 28023 1726853616.38926: iterating over new_blocks loaded from include file 28023 1726853616.38927: in VariableManager get_vars() 28023 1726853616.38942: done with get_vars() 28023 1726853616.38944: filtering new block on tags 28023 1726853616.38966: done filtering new block on tags 28023 1726853616.38968: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 28023 1726853616.38974: extending task lists for all hosts with included blocks 28023 1726853616.39589: done extending task lists 28023 1726853616.39590: done processing included files 28023 1726853616.39591: results queue empty 28023 1726853616.39592: checking for any_errors_fatal 28023 1726853616.39594: done checking for any_errors_fatal 28023 1726853616.39595: checking for max_fail_percentage 28023 1726853616.39596: done checking for max_fail_percentage 28023 1726853616.39597: checking to see if all hosts have failed and the 
running result is not ok 28023 1726853616.39598: done checking to see if all hosts have failed 28023 1726853616.39598: getting the remaining hosts for this loop 28023 1726853616.39600: done getting the remaining hosts for this loop 28023 1726853616.39602: getting the next task for host managed_node3 28023 1726853616.39611: done getting next task for host managed_node3 28023 1726853616.39613: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28023 1726853616.39615: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.39618: getting variables 28023 1726853616.39619: in VariableManager get_vars() 28023 1726853616.39632: Calling all_inventory to load vars for managed_node3 28023 1726853616.39634: Calling groups_inventory to load vars for managed_node3 28023 1726853616.39636: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.39641: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.39643: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.39645: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.39838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.40051: done with get_vars() 28023 1726853616.40063: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:33:36 -0400 (0:00:00.039) 0:00:08.485 ****** 28023 1726853616.40134: entering _queue_task() for managed_node3/include_tasks 28023 1726853616.40442: worker is 1 (out of 1 available) 28023 1726853616.40453: exiting _queue_task() for managed_node3/include_tasks 28023 1726853616.40468: done queuing things up, now waiting for results queue to drain 28023 1726853616.40470: waiting for pending results... 
28023 1726853616.40744: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 28023 1726853616.40891: in run() - task 02083763-bbaf-fdb6-dad7-000000000214 28023 1726853616.40896: variable 'ansible_search_path' from source: unknown 28023 1726853616.40899: variable 'ansible_search_path' from source: unknown 28023 1726853616.41078: calling self._execute() 28023 1726853616.41081: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.41084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.41087: variable 'omit' from source: magic vars 28023 1726853616.41450: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.41469: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.41480: _execute() done 28023 1726853616.41487: dumping result to json 28023 1726853616.41493: done dumping result, returning 28023 1726853616.41502: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-fdb6-dad7-000000000214] 28023 1726853616.41510: sending task result for task 02083763-bbaf-fdb6-dad7-000000000214 28023 1726853616.41660: no more pending results, returning what we have 28023 1726853616.41665: in VariableManager get_vars() 28023 1726853616.41715: Calling all_inventory to load vars for managed_node3 28023 1726853616.41718: Calling groups_inventory to load vars for managed_node3 28023 1726853616.41720: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.41735: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.41738: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.41741: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.42150: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000214 28023 1726853616.42154: WORKER PROCESS EXITING 28023 
1726853616.42184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.42454: done with get_vars() 28023 1726853616.42465: variable 'ansible_search_path' from source: unknown 28023 1726853616.42466: variable 'ansible_search_path' from source: unknown 28023 1726853616.42502: we have included files to process 28023 1726853616.42504: generating all_blocks data 28023 1726853616.42505: done generating all_blocks data 28023 1726853616.42506: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853616.42507: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853616.42509: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853616.42749: done processing included file 28023 1726853616.42751: iterating over new_blocks loaded from include file 28023 1726853616.42753: in VariableManager get_vars() 28023 1726853616.42777: done with get_vars() 28023 1726853616.42778: filtering new block on tags 28023 1726853616.42794: done filtering new block on tags 28023 1726853616.42796: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 28023 1726853616.42801: extending task lists for all hosts with included blocks 28023 1726853616.42903: done extending task lists 28023 1726853616.42904: done processing included files 28023 1726853616.42905: results queue empty 28023 1726853616.42906: checking for any_errors_fatal 28023 1726853616.42908: done checking for any_errors_fatal 28023 1726853616.42909: checking for max_fail_percentage 28023 1726853616.42910: done checking for 
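The include chain being processed above (tests_route_device.yml:21 → assert_device_present.yml:3 → get_interface_stat.yml) corresponds to a nested `include_tasks` pattern. Only the file names, task names, and task-path line numbers are confirmed by the log; the exact YAML below is a hedged reconstruction:

```yaml
# assert_device_present.yml (sketch; structure inferred from the log's
# "Include the task 'get_interface_stat.yml'" task at line 3)
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
```

Each `include_tasks` resolves at runtime, which is why the log shows a "we have included files to process / generating all_blocks data / extending task lists" cycle for every include: the new blocks are parsed, filtered on tags, and appended to the per-host task list before the iterator hands out the next task.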
max_fail_percentage 28023 1726853616.42911: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.42912: done checking to see if all hosts have failed 28023 1726853616.42912: getting the remaining hosts for this loop 28023 1726853616.42914: done getting the remaining hosts for this loop 28023 1726853616.42916: getting the next task for host managed_node3 28023 1726853616.42920: done getting next task for host managed_node3 28023 1726853616.42922: ^ task is: TASK: Get stat for interface {{ interface }} 28023 1726853616.42924: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.42927: getting variables 28023 1726853616.42928: in VariableManager get_vars() 28023 1726853616.42941: Calling all_inventory to load vars for managed_node3 28023 1726853616.42943: Calling groups_inventory to load vars for managed_node3 28023 1726853616.42945: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.42954: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.42959: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.42962: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.43148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.43374: done with get_vars() 28023 1726853616.43387: done getting variables 28023 1726853616.43539: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:33:36 -0400 (0:00:00.034) 0:00:08.519 ****** 28023 1726853616.43569: entering _queue_task() for managed_node3/stat 28023 1726853616.43998: worker is 1 (out of 1 available) 28023 1726853616.44008: exiting _queue_task() for managed_node3/stat 28023 1726853616.44018: done queuing things up, now waiting for results queue to drain 28023 1726853616.44019: waiting for pending results... 
28023 1726853616.44140: running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 28023 1726853616.44267: in run() - task 02083763-bbaf-fdb6-dad7-000000000267 28023 1726853616.44289: variable 'ansible_search_path' from source: unknown 28023 1726853616.44297: variable 'ansible_search_path' from source: unknown 28023 1726853616.44334: calling self._execute() 28023 1726853616.44430: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.44477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.44481: variable 'omit' from source: magic vars 28023 1726853616.44840: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.44858: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.44868: variable 'omit' from source: magic vars 28023 1726853616.44916: variable 'omit' from source: magic vars 28023 1726853616.45020: variable 'interface' from source: set_fact 28023 1726853616.45028: variable 'omit' from source: magic vars 28023 1726853616.45077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853616.45178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853616.45181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853616.45184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853616.45201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853616.45242: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853616.45251: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.45262: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.45380: Set connection var ansible_shell_type to sh 28023 1726853616.45394: Set connection var ansible_shell_executable to /bin/sh 28023 1726853616.45405: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853616.45415: Set connection var ansible_connection to ssh 28023 1726853616.45460: Set connection var ansible_pipelining to False 28023 1726853616.45464: Set connection var ansible_timeout to 10 28023 1726853616.45478: variable 'ansible_shell_executable' from source: unknown 28023 1726853616.45487: variable 'ansible_connection' from source: unknown 28023 1726853616.45495: variable 'ansible_module_compression' from source: unknown 28023 1726853616.45568: variable 'ansible_shell_type' from source: unknown 28023 1726853616.45575: variable 'ansible_shell_executable' from source: unknown 28023 1726853616.45578: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.45580: variable 'ansible_pipelining' from source: unknown 28023 1726853616.45582: variable 'ansible_timeout' from source: unknown 28023 1726853616.45585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.45761: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853616.45786: variable 'omit' from source: magic vars 28023 1726853616.45798: starting attempt loop 28023 1726853616.45806: running the handler 28023 1726853616.45826: _low_level_execute_command(): starting 28023 1726853616.45840: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853616.46674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853616.46692: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 28023 1726853616.46792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853616.46822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853616.46842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853616.46867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853616.46976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853616.49013: stdout chunk (state=3): >>>/root <<< 28023 1726853616.49201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853616.49205: stdout chunk (state=3): >>><<< 28023 1726853616.49208: stderr chunk (state=3): >>><<< 28023 1726853616.49212: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853616.49214: _low_level_execute_command(): starting 28023 1726853616.49217: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494 `" && echo ansible-tmp-1726853616.4910676-28460-46942020273494="` echo /root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494 `" ) && sleep 0' 28023 1726853616.50075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853616.50089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853616.50265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853616.50281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853616.50362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853616.50422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853616.52411: stdout chunk (state=3): >>>ansible-tmp-1726853616.4910676-28460-46942020273494=/root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494 <<< 28023 1726853616.52562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853616.52581: stdout chunk (state=3): >>><<< 28023 1726853616.52593: stderr chunk (state=3): >>><<< 28023 1726853616.52616: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853616.4910676-28460-46942020273494=/root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853616.52680: variable 'ansible_module_compression' from source: unknown 28023 1726853616.52751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28023 1726853616.52803: variable 'ansible_facts' from source: unknown 28023 1726853616.52911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494/AnsiballZ_stat.py 28023 1726853616.53306: Sending initial data 28023 1726853616.53401: Sent initial data (152 bytes) 28023 1726853616.54688: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853616.54881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853616.54885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853616.54998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853616.56912: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853616.56946: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853616.57261: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp2e_gkaxk /root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494/AnsiballZ_stat.py <<< 28023 1726853616.57265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494/AnsiballZ_stat.py" <<< 28023 1726853616.57297: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp2e_gkaxk" to remote "/root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494/AnsiballZ_stat.py" <<< 28023 1726853616.58945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853616.59023: stderr chunk (state=3): >>><<< 28023 1726853616.59035: stdout chunk (state=3): >>><<< 28023 1726853616.59277: done transferring module to remote 28023 1726853616.59280: _low_level_execute_command(): starting 28023 1726853616.59283: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494/ /root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494/AnsiballZ_stat.py && sleep 0' 28023 1726853616.60369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853616.60388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853616.60450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853616.60511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853616.60542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853616.60591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853616.60647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853616.62596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853616.62620: stdout chunk (state=3): >>><<< 28023 1726853616.62623: stderr chunk (state=3): >>><<< 28023 1726853616.62724: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853616.62733: _low_level_execute_command(): starting 28023 1726853616.62736: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494/AnsiballZ_stat.py && sleep 0' 28023 1726853616.63292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853616.63315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853616.63334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853616.63351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853616.63372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853616.63430: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853616.63486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' <<< 28023 1726853616.63512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853616.63538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853616.63648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853616.79154: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31056, "dev": 23, "nlink": 1, "atime": 1726853614.995531, "mtime": 1726853614.995531, "ctime": 1726853614.995531, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28023 1726853616.80527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853616.80551: stderr chunk (state=3): >>><<< 28023 1726853616.80554: stdout chunk (state=3): >>><<< 28023 1726853616.80580: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31056, "dev": 23, "nlink": 1, "atime": 1726853614.995531, "mtime": 1726853614.995531, "ctime": 1726853614.995531, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853616.80613: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853616.80621: _low_level_execute_command(): starting 28023 1726853616.80625: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853616.4910676-28460-46942020273494/ > /dev/null 2>&1 && sleep 0' 28023 1726853616.81036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853616.81074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853616.81078: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853616.81080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853616.81082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853616.81084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853616.81133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853616.81136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853616.81140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853616.81205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853616.83170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853616.83184: stdout chunk (state=3): >>><<< 28023 1726853616.83187: stderr chunk (state=3): >>><<< 28023 1726853616.83202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853616.83205: handler run complete 28023 1726853616.83248: attempt loop complete, returning result 28023 1726853616.83251: _execute() done 28023 1726853616.83254: dumping result to json 28023 1726853616.83261: done dumping result, returning 28023 1726853616.83266: done running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 [02083763-bbaf-fdb6-dad7-000000000267] 28023 1726853616.83272: sending task result for task 02083763-bbaf-fdb6-dad7-000000000267 28023 1726853616.83384: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000267 28023 1726853616.83386: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853614.995531, "block_size": 4096, "blocks": 0, "ctime": 1726853614.995531, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31056, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1726853614.995531, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": 
true, "xgrp": true, "xoth": true, "xusr": true } } 28023 1726853616.83491: no more pending results, returning what we have 28023 1726853616.83500: results queue empty 28023 1726853616.83501: checking for any_errors_fatal 28023 1726853616.83503: done checking for any_errors_fatal 28023 1726853616.83503: checking for max_fail_percentage 28023 1726853616.83505: done checking for max_fail_percentage 28023 1726853616.83505: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.83506: done checking to see if all hosts have failed 28023 1726853616.83507: getting the remaining hosts for this loop 28023 1726853616.83508: done getting the remaining hosts for this loop 28023 1726853616.83512: getting the next task for host managed_node3 28023 1726853616.83519: done getting next task for host managed_node3 28023 1726853616.83522: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 28023 1726853616.83524: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.83528: getting variables 28023 1726853616.83529: in VariableManager get_vars() 28023 1726853616.83568: Calling all_inventory to load vars for managed_node3 28023 1726853616.83572: Calling groups_inventory to load vars for managed_node3 28023 1726853616.83574: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.83584: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.83586: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.83588: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.83726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.83884: done with get_vars() 28023 1726853616.83892: done getting variables 28023 1726853616.83964: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 28023 1726853616.84050: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:33:36 -0400 (0:00:00.405) 0:00:08.924 ****** 28023 1726853616.84076: entering _queue_task() for managed_node3/assert 28023 1726853616.84078: Creating lock for assert 28023 1726853616.84298: worker is 1 (out of 1 available) 28023 1726853616.84311: exiting _queue_task() for managed_node3/assert 28023 1726853616.84324: done queuing things up, now waiting for results queue to drain 28023 1726853616.84325: waiting for pending results... 
28023 1726853616.84481: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'ethtest0' 28023 1726853616.84543: in run() - task 02083763-bbaf-fdb6-dad7-000000000215 28023 1726853616.84556: variable 'ansible_search_path' from source: unknown 28023 1726853616.84566: variable 'ansible_search_path' from source: unknown 28023 1726853616.84597: calling self._execute() 28023 1726853616.84668: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.84675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.84685: variable 'omit' from source: magic vars 28023 1726853616.85022: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.85176: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.85179: variable 'omit' from source: magic vars 28023 1726853616.85182: variable 'omit' from source: magic vars 28023 1726853616.85186: variable 'interface' from source: set_fact 28023 1726853616.85209: variable 'omit' from source: magic vars 28023 1726853616.85255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853616.85298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853616.85321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853616.85342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853616.85364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853616.85402: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853616.85410: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.85417: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.85520: Set connection var ansible_shell_type to sh 28023 1726853616.85537: Set connection var ansible_shell_executable to /bin/sh 28023 1726853616.85547: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853616.85558: Set connection var ansible_connection to ssh 28023 1726853616.85568: Set connection var ansible_pipelining to False 28023 1726853616.85580: Set connection var ansible_timeout to 10 28023 1726853616.85608: variable 'ansible_shell_executable' from source: unknown 28023 1726853616.85614: variable 'ansible_connection' from source: unknown 28023 1726853616.85619: variable 'ansible_module_compression' from source: unknown 28023 1726853616.85624: variable 'ansible_shell_type' from source: unknown 28023 1726853616.85628: variable 'ansible_shell_executable' from source: unknown 28023 1726853616.85633: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.85638: variable 'ansible_pipelining' from source: unknown 28023 1726853616.85642: variable 'ansible_timeout' from source: unknown 28023 1726853616.85651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.85793: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853616.85802: variable 'omit' from source: magic vars 28023 1726853616.85812: starting attempt loop 28023 1726853616.85815: running the handler 28023 1726853616.85949: variable 'interface_stat' from source: set_fact 28023 1726853616.85953: Evaluated conditional (interface_stat.stat.exists): True 28023 1726853616.86076: handler run complete 28023 1726853616.86079: attempt loop complete, returning result 28023 
1726853616.86081: _execute() done 28023 1726853616.86083: dumping result to json 28023 1726853616.86085: done dumping result, returning 28023 1726853616.86087: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'ethtest0' [02083763-bbaf-fdb6-dad7-000000000215] 28023 1726853616.86088: sending task result for task 02083763-bbaf-fdb6-dad7-000000000215 28023 1726853616.86155: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000215 28023 1726853616.86158: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 28023 1726853616.86216: no more pending results, returning what we have 28023 1726853616.86219: results queue empty 28023 1726853616.86220: checking for any_errors_fatal 28023 1726853616.86227: done checking for any_errors_fatal 28023 1726853616.86228: checking for max_fail_percentage 28023 1726853616.86230: done checking for max_fail_percentage 28023 1726853616.86231: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.86232: done checking to see if all hosts have failed 28023 1726853616.86234: getting the remaining hosts for this loop 28023 1726853616.86236: done getting the remaining hosts for this loop 28023 1726853616.86241: getting the next task for host managed_node3 28023 1726853616.86248: done getting next task for host managed_node3 28023 1726853616.86251: ^ task is: TASK: Set interface1 28023 1726853616.86252: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.86258: getting variables 28023 1726853616.86259: in VariableManager get_vars() 28023 1726853616.86297: Calling all_inventory to load vars for managed_node3 28023 1726853616.86299: Calling groups_inventory to load vars for managed_node3 28023 1726853616.86301: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.86310: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.86313: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.86315: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.86522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.86750: done with get_vars() 28023 1726853616.86763: done getting variables 28023 1726853616.86829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set interface1] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:23 Friday 20 September 2024 13:33:36 -0400 (0:00:00.027) 0:00:08.952 ****** 28023 1726853616.86855: entering _queue_task() for managed_node3/set_fact 28023 1726853616.87284: worker is 1 (out of 1 available) 28023 1726853616.87297: exiting _queue_task() for managed_node3/set_fact 28023 1726853616.87310: done queuing things up, now waiting for results queue to drain 28023 1726853616.87312: waiting for pending results... 
28023 1726853616.87494: running TaskExecutor() for managed_node3/TASK: Set interface1 28023 1726853616.87550: in run() - task 02083763-bbaf-fdb6-dad7-00000000000f 28023 1726853616.87573: variable 'ansible_search_path' from source: unknown 28023 1726853616.87598: calling self._execute() 28023 1726853616.87669: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.87675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.87684: variable 'omit' from source: magic vars 28023 1726853616.87954: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.87968: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.87973: variable 'omit' from source: magic vars 28023 1726853616.87994: variable 'omit' from source: magic vars 28023 1726853616.88018: variable 'interface1' from source: play vars 28023 1726853616.88079: variable 'interface1' from source: play vars 28023 1726853616.88093: variable 'omit' from source: magic vars 28023 1726853616.88128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853616.88154: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853616.88175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853616.88189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853616.88201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853616.88228: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853616.88231: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.88234: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 28023 1726853616.88303: Set connection var ansible_shell_type to sh 28023 1726853616.88312: Set connection var ansible_shell_executable to /bin/sh 28023 1726853616.88315: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853616.88319: Set connection var ansible_connection to ssh 28023 1726853616.88324: Set connection var ansible_pipelining to False 28023 1726853616.88334: Set connection var ansible_timeout to 10 28023 1726853616.88350: variable 'ansible_shell_executable' from source: unknown 28023 1726853616.88354: variable 'ansible_connection' from source: unknown 28023 1726853616.88356: variable 'ansible_module_compression' from source: unknown 28023 1726853616.88358: variable 'ansible_shell_type' from source: unknown 28023 1726853616.88363: variable 'ansible_shell_executable' from source: unknown 28023 1726853616.88365: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.88369: variable 'ansible_pipelining' from source: unknown 28023 1726853616.88373: variable 'ansible_timeout' from source: unknown 28023 1726853616.88378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.88502: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853616.88510: variable 'omit' from source: magic vars 28023 1726853616.88515: starting attempt loop 28023 1726853616.88518: running the handler 28023 1726853616.88531: handler run complete 28023 1726853616.88538: attempt loop complete, returning result 28023 1726853616.88541: _execute() done 28023 1726853616.88543: dumping result to json 28023 1726853616.88546: done dumping result, returning 28023 1726853616.88554: done running TaskExecutor() for 
managed_node3/TASK: Set interface1 [02083763-bbaf-fdb6-dad7-00000000000f] 28023 1726853616.88560: sending task result for task 02083763-bbaf-fdb6-dad7-00000000000f 28023 1726853616.88640: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000000f 28023 1726853616.88643: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "interface": "ethtest1" }, "changed": false } 28023 1726853616.88699: no more pending results, returning what we have 28023 1726853616.88702: results queue empty 28023 1726853616.88703: checking for any_errors_fatal 28023 1726853616.88710: done checking for any_errors_fatal 28023 1726853616.88711: checking for max_fail_percentage 28023 1726853616.88712: done checking for max_fail_percentage 28023 1726853616.88713: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.88714: done checking to see if all hosts have failed 28023 1726853616.88715: getting the remaining hosts for this loop 28023 1726853616.88717: done getting the remaining hosts for this loop 28023 1726853616.88720: getting the next task for host managed_node3 28023 1726853616.88726: done getting next task for host managed_node3 28023 1726853616.88728: ^ task is: TASK: Show interfaces 28023 1726853616.88730: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.88733: getting variables 28023 1726853616.88734: in VariableManager get_vars() 28023 1726853616.88787: Calling all_inventory to load vars for managed_node3 28023 1726853616.88790: Calling groups_inventory to load vars for managed_node3 28023 1726853616.88792: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.88801: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.88803: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.88806: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.89020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.89224: done with get_vars() 28023 1726853616.89235: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:26 Friday 20 September 2024 13:33:36 -0400 (0:00:00.024) 0:00:08.977 ****** 28023 1726853616.89331: entering _queue_task() for managed_node3/include_tasks 28023 1726853616.89602: worker is 1 (out of 1 available) 28023 1726853616.89680: exiting _queue_task() for managed_node3/include_tasks 28023 1726853616.89692: done queuing things up, now waiting for results queue to drain 28023 1726853616.89693: waiting for pending results... 
28023 1726853616.89945: running TaskExecutor() for managed_node3/TASK: Show interfaces 28023 1726853616.90165: in run() - task 02083763-bbaf-fdb6-dad7-000000000010 28023 1726853616.90169: variable 'ansible_search_path' from source: unknown 28023 1726853616.90174: calling self._execute() 28023 1726853616.90248: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.90263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.90285: variable 'omit' from source: magic vars 28023 1726853616.90659: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.90681: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.90692: _execute() done 28023 1726853616.90705: dumping result to json 28023 1726853616.90712: done dumping result, returning 28023 1726853616.90722: done running TaskExecutor() for managed_node3/TASK: Show interfaces [02083763-bbaf-fdb6-dad7-000000000010] 28023 1726853616.90731: sending task result for task 02083763-bbaf-fdb6-dad7-000000000010 28023 1726853616.91011: no more pending results, returning what we have 28023 1726853616.91016: in VariableManager get_vars() 28023 1726853616.91078: Calling all_inventory to load vars for managed_node3 28023 1726853616.91081: Calling groups_inventory to load vars for managed_node3 28023 1726853616.91083: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.91100: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.91103: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.91107: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.91296: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000010 28023 1726853616.91301: WORKER PROCESS EXITING 28023 1726853616.91312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 
1726853616.91438: done with get_vars() 28023 1726853616.91444: variable 'ansible_search_path' from source: unknown 28023 1726853616.91453: we have included files to process 28023 1726853616.91454: generating all_blocks data 28023 1726853616.91455: done generating all_blocks data 28023 1726853616.91462: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853616.91463: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853616.91465: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853616.91532: in VariableManager get_vars() 28023 1726853616.91547: done with get_vars() 28023 1726853616.91624: done processing included file 28023 1726853616.91626: iterating over new_blocks loaded from include file 28023 1726853616.91627: in VariableManager get_vars() 28023 1726853616.91637: done with get_vars() 28023 1726853616.91639: filtering new block on tags 28023 1726853616.91649: done filtering new block on tags 28023 1726853616.91650: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 28023 1726853616.91653: extending task lists for all hosts with included blocks 28023 1726853616.92083: done extending task lists 28023 1726853616.92084: done processing included files 28023 1726853616.92084: results queue empty 28023 1726853616.92085: checking for any_errors_fatal 28023 1726853616.92087: done checking for any_errors_fatal 28023 1726853616.92087: checking for max_fail_percentage 28023 1726853616.92088: done checking for max_fail_percentage 28023 1726853616.92088: checking to see if all hosts have failed and the running result is not ok 28023 
1726853616.92089: done checking to see if all hosts have failed 28023 1726853616.92089: getting the remaining hosts for this loop 28023 1726853616.92090: done getting the remaining hosts for this loop 28023 1726853616.92092: getting the next task for host managed_node3 28023 1726853616.92094: done getting next task for host managed_node3 28023 1726853616.92095: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28023 1726853616.92097: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.92099: getting variables 28023 1726853616.92099: in VariableManager get_vars() 28023 1726853616.92109: Calling all_inventory to load vars for managed_node3 28023 1726853616.92111: Calling groups_inventory to load vars for managed_node3 28023 1726853616.92112: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.92116: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.92117: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.92119: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.92207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.92327: done with get_vars() 28023 1726853616.92334: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:33:36 -0400 (0:00:00.030) 0:00:09.007 ****** 28023 1726853616.92387: entering _queue_task() for managed_node3/include_tasks 28023 1726853616.92614: worker is 1 (out of 1 available) 28023 1726853616.92627: exiting _queue_task() for managed_node3/include_tasks 28023 1726853616.92642: done queuing things up, now waiting for results queue to drain 28023 1726853616.92643: waiting for pending results... 
28023 1726853616.92810: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 28023 1726853616.92870: in run() - task 02083763-bbaf-fdb6-dad7-000000000282 28023 1726853616.92883: variable 'ansible_search_path' from source: unknown 28023 1726853616.92887: variable 'ansible_search_path' from source: unknown 28023 1726853616.92914: calling self._execute() 28023 1726853616.92980: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.92985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.92995: variable 'omit' from source: magic vars 28023 1726853616.93334: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.93576: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.93580: _execute() done 28023 1726853616.93583: dumping result to json 28023 1726853616.93586: done dumping result, returning 28023 1726853616.93589: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-fdb6-dad7-000000000282] 28023 1726853616.93591: sending task result for task 02083763-bbaf-fdb6-dad7-000000000282 28023 1726853616.93650: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000282 28023 1726853616.93653: WORKER PROCESS EXITING 28023 1726853616.93697: no more pending results, returning what we have 28023 1726853616.93701: in VariableManager get_vars() 28023 1726853616.93738: Calling all_inventory to load vars for managed_node3 28023 1726853616.93740: Calling groups_inventory to load vars for managed_node3 28023 1726853616.93743: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.93752: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.93755: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.93761: Calling groups_plugins_play to load vars for managed_node3 28023 
1726853616.94038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.94368: done with get_vars() 28023 1726853616.94378: variable 'ansible_search_path' from source: unknown 28023 1726853616.94379: variable 'ansible_search_path' from source: unknown 28023 1726853616.94418: we have included files to process 28023 1726853616.94419: generating all_blocks data 28023 1726853616.94420: done generating all_blocks data 28023 1726853616.94422: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853616.94423: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853616.94425: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853616.94737: done processing included file 28023 1726853616.94744: iterating over new_blocks loaded from include file 28023 1726853616.94746: in VariableManager get_vars() 28023 1726853616.94770: done with get_vars() 28023 1726853616.94774: filtering new block on tags 28023 1726853616.94791: done filtering new block on tags 28023 1726853616.94793: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 28023 1726853616.94798: extending task lists for all hosts with included blocks 28023 1726853616.94903: done extending task lists 28023 1726853616.94905: done processing included files 28023 1726853616.94906: results queue empty 28023 1726853616.94906: checking for any_errors_fatal 28023 1726853616.94909: done checking for any_errors_fatal 28023 1726853616.94910: checking for max_fail_percentage 28023 1726853616.94911: done 
checking for max_fail_percentage 28023 1726853616.94912: checking to see if all hosts have failed and the running result is not ok 28023 1726853616.94913: done checking to see if all hosts have failed 28023 1726853616.94913: getting the remaining hosts for this loop 28023 1726853616.94914: done getting the remaining hosts for this loop 28023 1726853616.94917: getting the next task for host managed_node3 28023 1726853616.94920: done getting next task for host managed_node3 28023 1726853616.94922: ^ task is: TASK: Gather current interface info 28023 1726853616.94925: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853616.94928: getting variables 28023 1726853616.94928: in VariableManager get_vars() 28023 1726853616.94941: Calling all_inventory to load vars for managed_node3 28023 1726853616.94943: Calling groups_inventory to load vars for managed_node3 28023 1726853616.94945: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853616.94949: Calling all_plugins_play to load vars for managed_node3 28023 1726853616.94952: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853616.94954: Calling groups_plugins_play to load vars for managed_node3 28023 1726853616.95138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853616.95407: done with get_vars() 28023 1726853616.95414: done getting variables 28023 1726853616.95443: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:33:36 -0400 (0:00:00.030) 0:00:09.038 ****** 28023 1726853616.95465: entering _queue_task() for managed_node3/command 28023 1726853616.95702: worker is 1 (out of 1 available) 28023 1726853616.95716: exiting _queue_task() for managed_node3/command 28023 1726853616.95729: done queuing things up, now waiting for results queue to drain 28023 1726853616.95730: waiting for pending results... 
28023 1726853616.95904: running TaskExecutor() for managed_node3/TASK: Gather current interface info 28023 1726853616.95973: in run() - task 02083763-bbaf-fdb6-dad7-0000000002e0 28023 1726853616.95982: variable 'ansible_search_path' from source: unknown 28023 1726853616.95986: variable 'ansible_search_path' from source: unknown 28023 1726853616.96017: calling self._execute() 28023 1726853616.96089: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.96092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.96103: variable 'omit' from source: magic vars 28023 1726853616.96375: variable 'ansible_distribution_major_version' from source: facts 28023 1726853616.96386: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853616.96391: variable 'omit' from source: magic vars 28023 1726853616.96425: variable 'omit' from source: magic vars 28023 1726853616.96449: variable 'omit' from source: magic vars 28023 1726853616.96484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853616.96514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853616.96539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853616.96553: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853616.96565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853616.96590: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853616.96593: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.96597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 
1726853616.96667: Set connection var ansible_shell_type to sh 28023 1726853616.96676: Set connection var ansible_shell_executable to /bin/sh 28023 1726853616.96681: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853616.96686: Set connection var ansible_connection to ssh 28023 1726853616.96691: Set connection var ansible_pipelining to False 28023 1726853616.96697: Set connection var ansible_timeout to 10 28023 1726853616.96719: variable 'ansible_shell_executable' from source: unknown 28023 1726853616.96723: variable 'ansible_connection' from source: unknown 28023 1726853616.96726: variable 'ansible_module_compression' from source: unknown 28023 1726853616.96728: variable 'ansible_shell_type' from source: unknown 28023 1726853616.96730: variable 'ansible_shell_executable' from source: unknown 28023 1726853616.96733: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853616.96735: variable 'ansible_pipelining' from source: unknown 28023 1726853616.96737: variable 'ansible_timeout' from source: unknown 28023 1726853616.96739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853616.96840: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853616.96850: variable 'omit' from source: magic vars 28023 1726853616.96861: starting attempt loop 28023 1726853616.96863: running the handler 28023 1726853616.96875: _low_level_execute_command(): starting 28023 1726853616.96882: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853616.97581: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853616.97598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853616.97616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853616.97713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853616.99403: stdout chunk (state=3): >>>/root <<< 28023 1726853616.99491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853616.99528: stderr chunk (state=3): >>><<< 28023 1726853616.99532: stdout chunk (state=3): >>><<< 28023 1726853616.99552: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853616.99567: _low_level_execute_command(): starting 28023 1726853616.99574: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540 `" && echo ansible-tmp-1726853616.9955485-28497-109757116009540="` echo /root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540 `" ) && sleep 0' 28023 1726853617.00037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853617.00040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853617.00042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28023 1726853617.00045: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853617.00055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.00121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853617.00124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853617.00128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.00243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.02152: stdout chunk (state=3): >>>ansible-tmp-1726853616.9955485-28497-109757116009540=/root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540 <<< 28023 1726853617.02254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853617.02283: stderr chunk (state=3): >>><<< 28023 1726853617.02289: stdout chunk (state=3): >>><<< 28023 1726853617.02304: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853616.9955485-28497-109757116009540=/root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853617.02330: variable 'ansible_module_compression' from source: unknown 28023 1726853617.02377: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853617.02407: variable 'ansible_facts' from source: unknown 28023 1726853617.02467: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540/AnsiballZ_command.py 28023 1726853617.02570: Sending initial data 28023 1726853617.02576: Sent initial data (156 bytes) 28023 1726853617.03151: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853617.03174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.03262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.04900: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853617.04957: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853617.05013: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmps9flpi0q /root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540/AnsiballZ_command.py <<< 28023 1726853617.05019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540/AnsiballZ_command.py" <<< 28023 1726853617.05072: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmps9flpi0q" to remote "/root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540/AnsiballZ_command.py" <<< 28023 1726853617.05076: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540/AnsiballZ_command.py" <<< 28023 1726853617.05651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853617.05692: stderr chunk (state=3): >>><<< 28023 1726853617.05697: stdout chunk (state=3): >>><<< 28023 1726853617.05736: done transferring module to remote 28023 1726853617.05744: _low_level_execute_command(): starting 28023 1726853617.05749: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540/ /root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540/AnsiballZ_command.py && sleep 0' 28023 1726853617.06176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853617.06179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853617.06182: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853617.06184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.06247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853617.06251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.06316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.08157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853617.08182: stderr chunk (state=3): >>><<< 28023 1726853617.08186: stdout chunk (state=3): >>><<< 28023 1726853617.08201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853617.08204: _low_level_execute_command(): starting 28023 1726853617.08209: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540/AnsiballZ_command.py && sleep 0' 28023 1726853617.08624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853617.08629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.08632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853617.08634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.08687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' <<< 28023 1726853617.08690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.08763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.24556: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:33:37.241010", "end": "2024-09-20 13:33:37.244415", "delta": "0:00:00.003405", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853617.26148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853617.26153: stdout chunk (state=3): >>><<< 28023 1726853617.26155: stderr chunk (state=3): >>><<< 28023 1726853617.26176: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:33:37.241010", "end": "2024-09-20 13:33:37.244415", "delta": "0:00:00.003405", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853617.26209: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853617.26215: _low_level_execute_command(): starting 28023 1726853617.26220: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853616.9955485-28497-109757116009540/ > /dev/null 2>&1 && sleep 
0' 28023 1726853617.26663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853617.26666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.26668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853617.26672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.26721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853617.26733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853617.26736: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.26789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.28930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853617.28951: stderr chunk (state=3): >>><<< 28023 1726853617.29046: stdout chunk (state=3): >>><<< 28023 1726853617.29050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853617.29053: handler run complete 28023 1726853617.29055: Evaluated conditional (False): False 28023 1726853617.29057: attempt loop complete, returning result 28023 1726853617.29059: _execute() done 28023 1726853617.29061: dumping result to json 28023 1726853617.29063: done dumping result, returning 28023 1726853617.29065: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-fdb6-dad7-0000000002e0] 28023 1726853617.29080: sending task result for task 02083763-bbaf-fdb6-dad7-0000000002e0 28023 1726853617.29306: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000002e0 28023 1726853617.29310: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003405", "end": "2024-09-20 13:33:37.244415", "rc": 0, "start": "2024-09-20 13:33:37.241010" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 rpltstbr 28023 1726853617.29391: 
no more pending results, returning what we have 28023 1726853617.29395: results queue empty 28023 1726853617.29396: checking for any_errors_fatal 28023 1726853617.29398: done checking for any_errors_fatal 28023 1726853617.29398: checking for max_fail_percentage 28023 1726853617.29400: done checking for max_fail_percentage 28023 1726853617.29401: checking to see if all hosts have failed and the running result is not ok 28023 1726853617.29402: done checking to see if all hosts have failed 28023 1726853617.29403: getting the remaining hosts for this loop 28023 1726853617.29404: done getting the remaining hosts for this loop 28023 1726853617.29408: getting the next task for host managed_node3 28023 1726853617.29416: done getting next task for host managed_node3 28023 1726853617.29418: ^ task is: TASK: Set current_interfaces 28023 1726853617.29423: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853617.29427: getting variables 28023 1726853617.29429: in VariableManager get_vars() 28023 1726853617.29595: Calling all_inventory to load vars for managed_node3 28023 1726853617.29599: Calling groups_inventory to load vars for managed_node3 28023 1726853617.29602: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.29614: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.29617: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.29620: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.29939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.30170: done with get_vars() 28023 1726853617.30184: done getting variables 28023 1726853617.30254: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:33:37 -0400 (0:00:00.348) 0:00:09.386 ****** 28023 1726853617.30288: entering _queue_task() for managed_node3/set_fact 28023 1726853617.30668: worker is 1 (out of 1 available) 28023 1726853617.30707: exiting _queue_task() for managed_node3/set_fact 28023 1726853617.30717: done queuing things up, now waiting for results queue to drain 28023 1726853617.30718: waiting for pending results... 
28023 1726853617.30945: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 28023 1726853617.31122: in run() - task 02083763-bbaf-fdb6-dad7-0000000002e1 28023 1726853617.31126: variable 'ansible_search_path' from source: unknown 28023 1726853617.31129: variable 'ansible_search_path' from source: unknown 28023 1726853617.31142: calling self._execute() 28023 1726853617.31243: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.31247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.31255: variable 'omit' from source: magic vars 28023 1726853617.31576: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.31586: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.31592: variable 'omit' from source: magic vars 28023 1726853617.31622: variable 'omit' from source: magic vars 28023 1726853617.31696: variable '_current_interfaces' from source: set_fact 28023 1726853617.31744: variable 'omit' from source: magic vars 28023 1726853617.31777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853617.31804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853617.31819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853617.31832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.31843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.31867: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853617.31870: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.31874: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.31938: Set connection var ansible_shell_type to sh 28023 1726853617.31944: Set connection var ansible_shell_executable to /bin/sh 28023 1726853617.31949: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853617.31962: Set connection var ansible_connection to ssh 28023 1726853617.31965: Set connection var ansible_pipelining to False 28023 1726853617.31967: Set connection var ansible_timeout to 10 28023 1726853617.31989: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.31993: variable 'ansible_connection' from source: unknown 28023 1726853617.31995: variable 'ansible_module_compression' from source: unknown 28023 1726853617.31997: variable 'ansible_shell_type' from source: unknown 28023 1726853617.31999: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.32001: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.32003: variable 'ansible_pipelining' from source: unknown 28023 1726853617.32005: variable 'ansible_timeout' from source: unknown 28023 1726853617.32008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.32111: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853617.32119: variable 'omit' from source: magic vars 28023 1726853617.32124: starting attempt loop 28023 1726853617.32127: running the handler 28023 1726853617.32135: handler run complete 28023 1726853617.32143: attempt loop complete, returning result 28023 1726853617.32146: _execute() done 28023 1726853617.32148: dumping result to json 28023 1726853617.32153: done dumping result, returning 28023 
1726853617.32161: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [02083763-bbaf-fdb6-dad7-0000000002e1] 28023 1726853617.32164: sending task result for task 02083763-bbaf-fdb6-dad7-0000000002e1 28023 1726853617.32241: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000002e1 28023 1726853617.32244: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0", "rpltstbr" ] }, "changed": false } 28023 1726853617.32332: no more pending results, returning what we have 28023 1726853617.32335: results queue empty 28023 1726853617.32336: checking for any_errors_fatal 28023 1726853617.32342: done checking for any_errors_fatal 28023 1726853617.32342: checking for max_fail_percentage 28023 1726853617.32344: done checking for max_fail_percentage 28023 1726853617.32344: checking to see if all hosts have failed and the running result is not ok 28023 1726853617.32345: done checking to see if all hosts have failed 28023 1726853617.32346: getting the remaining hosts for this loop 28023 1726853617.32348: done getting the remaining hosts for this loop 28023 1726853617.32351: getting the next task for host managed_node3 28023 1726853617.32361: done getting next task for host managed_node3 28023 1726853617.32364: ^ task is: TASK: Show current_interfaces 28023 1726853617.32366: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853617.32370: getting variables 28023 1726853617.32372: in VariableManager get_vars() 28023 1726853617.32406: Calling all_inventory to load vars for managed_node3 28023 1726853617.32408: Calling groups_inventory to load vars for managed_node3 28023 1726853617.32410: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.32419: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.32422: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.32424: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.32576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.32699: done with get_vars() 28023 1726853617.32706: done getting variables 28023 1726853617.32744: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:33:37 -0400 (0:00:00.024) 0:00:09.411 ****** 28023 1726853617.32767: entering _queue_task() for managed_node3/debug 28023 1726853617.32973: worker is 1 (out of 1 available) 28023 1726853617.32986: exiting _queue_task() for managed_node3/debug 28023 1726853617.32997: done queuing things up, now waiting for results queue to drain 28023 1726853617.32998: waiting for pending results... 
28023 1726853617.33291: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 28023 1726853617.33295: in run() - task 02083763-bbaf-fdb6-dad7-000000000283 28023 1726853617.33299: variable 'ansible_search_path' from source: unknown 28023 1726853617.33302: variable 'ansible_search_path' from source: unknown 28023 1726853617.33423: calling self._execute() 28023 1726853617.33443: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.33454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.33466: variable 'omit' from source: magic vars 28023 1726853617.33819: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.33838: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.33856: variable 'omit' from source: magic vars 28023 1726853617.33965: variable 'omit' from source: magic vars 28023 1726853617.34005: variable 'current_interfaces' from source: set_fact 28023 1726853617.34042: variable 'omit' from source: magic vars 28023 1726853617.34094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853617.34139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853617.34163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853617.34179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.34190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.34212: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853617.34215: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.34219: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.34292: Set connection var ansible_shell_type to sh 28023 1726853617.34301: Set connection var ansible_shell_executable to /bin/sh 28023 1726853617.34310: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853617.34316: Set connection var ansible_connection to ssh 28023 1726853617.34321: Set connection var ansible_pipelining to False 28023 1726853617.34326: Set connection var ansible_timeout to 10 28023 1726853617.34345: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.34349: variable 'ansible_connection' from source: unknown 28023 1726853617.34352: variable 'ansible_module_compression' from source: unknown 28023 1726853617.34355: variable 'ansible_shell_type' from source: unknown 28023 1726853617.34357: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.34359: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.34364: variable 'ansible_pipelining' from source: unknown 28023 1726853617.34366: variable 'ansible_timeout' from source: unknown 28023 1726853617.34372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.34470: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853617.34479: variable 'omit' from source: magic vars 28023 1726853617.34484: starting attempt loop 28023 1726853617.34487: running the handler 28023 1726853617.34555: handler run complete 28023 1726853617.34558: attempt loop complete, returning result 28023 1726853617.34561: _execute() done 28023 1726853617.34563: dumping result to json 28023 1726853617.34565: done dumping result, returning 28023 1726853617.34567: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [02083763-bbaf-fdb6-dad7-000000000283] 28023 1726853617.34570: sending task result for task 02083763-bbaf-fdb6-dad7-000000000283 28023 1726853617.34734: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000283 28023 1726853617.34737: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0', 'rpltstbr'] 28023 1726853617.34789: no more pending results, returning what we have 28023 1726853617.34792: results queue empty 28023 1726853617.34793: checking for any_errors_fatal 28023 1726853617.34799: done checking for any_errors_fatal 28023 1726853617.34800: checking for max_fail_percentage 28023 1726853617.34801: done checking for max_fail_percentage 28023 1726853617.34802: checking to see if all hosts have failed and the running result is not ok 28023 1726853617.34803: done checking to see if all hosts have failed 28023 1726853617.34804: getting the remaining hosts for this loop 28023 1726853617.34806: done getting the remaining hosts for this loop 28023 1726853617.34810: getting the next task for host managed_node3 28023 1726853617.34818: done getting next task for host managed_node3 28023 1726853617.34821: ^ task is: TASK: Manage test interface 28023 1726853617.34824: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853617.34827: getting variables 28023 1726853617.34829: in VariableManager get_vars() 28023 1726853617.34874: Calling all_inventory to load vars for managed_node3 28023 1726853617.34877: Calling groups_inventory to load vars for managed_node3 28023 1726853617.34882: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.34894: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.34897: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.34900: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.35255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.35474: done with get_vars() 28023 1726853617.35484: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:28 Friday 20 September 2024 13:33:37 -0400 (0:00:00.027) 0:00:09.439 ****** 28023 1726853617.35570: entering _queue_task() for managed_node3/include_tasks 28023 1726853617.35804: worker is 1 (out of 1 available) 28023 1726853617.35819: exiting _queue_task() for managed_node3/include_tasks 28023 1726853617.35831: done queuing things up, now waiting for results queue to drain 28023 1726853617.35832: waiting for pending results... 
28023 1726853617.35999: running TaskExecutor() for managed_node3/TASK: Manage test interface 28023 1726853617.36054: in run() - task 02083763-bbaf-fdb6-dad7-000000000011 28023 1726853617.36070: variable 'ansible_search_path' from source: unknown 28023 1726853617.36103: calling self._execute() 28023 1726853617.36182: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.36187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.36197: variable 'omit' from source: magic vars 28023 1726853617.36706: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.36710: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.36713: _execute() done 28023 1726853617.36715: dumping result to json 28023 1726853617.36717: done dumping result, returning 28023 1726853617.36720: done running TaskExecutor() for managed_node3/TASK: Manage test interface [02083763-bbaf-fdb6-dad7-000000000011] 28023 1726853617.36722: sending task result for task 02083763-bbaf-fdb6-dad7-000000000011 28023 1726853617.36858: no more pending results, returning what we have 28023 1726853617.36863: in VariableManager get_vars() 28023 1726853617.36913: Calling all_inventory to load vars for managed_node3 28023 1726853617.36917: Calling groups_inventory to load vars for managed_node3 28023 1726853617.36919: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.36934: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.36937: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.36940: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.37376: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000011 28023 1726853617.37379: WORKER PROCESS EXITING 28023 1726853617.37409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 
28023 1726853617.37669: done with get_vars() 28023 1726853617.37681: variable 'ansible_search_path' from source: unknown 28023 1726853617.37694: we have included files to process 28023 1726853617.37695: generating all_blocks data 28023 1726853617.37696: done generating all_blocks data 28023 1726853617.37700: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28023 1726853617.37701: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28023 1726853617.37703: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28023 1726853617.38380: in VariableManager get_vars() 28023 1726853617.38406: done with get_vars() 28023 1726853617.39055: done processing included file 28023 1726853617.39057: iterating over new_blocks loaded from include file 28023 1726853617.39059: in VariableManager get_vars() 28023 1726853617.39078: done with get_vars() 28023 1726853617.39080: filtering new block on tags 28023 1726853617.39110: done filtering new block on tags 28023 1726853617.39113: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 28023 1726853617.39117: extending task lists for all hosts with included blocks 28023 1726853617.39950: done extending task lists 28023 1726853617.39952: done processing included files 28023 1726853617.39953: results queue empty 28023 1726853617.39953: checking for any_errors_fatal 28023 1726853617.39956: done checking for any_errors_fatal 28023 1726853617.39957: checking for max_fail_percentage 28023 1726853617.39958: done checking for max_fail_percentage 28023 1726853617.39959: checking to see if all hosts have failed and the 
running result is not ok 28023 1726853617.39960: done checking to see if all hosts have failed 28023 1726853617.39961: getting the remaining hosts for this loop 28023 1726853617.39962: done getting the remaining hosts for this loop 28023 1726853617.39964: getting the next task for host managed_node3 28023 1726853617.39968: done getting next task for host managed_node3 28023 1726853617.39973: ^ task is: TASK: Ensure state in ["present", "absent"] 28023 1726853617.39975: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853617.39978: getting variables 28023 1726853617.39979: in VariableManager get_vars() 28023 1726853617.39990: Calling all_inventory to load vars for managed_node3 28023 1726853617.39992: Calling groups_inventory to load vars for managed_node3 28023 1726853617.39994: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.39999: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.40001: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.40004: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.40164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.40594: done with get_vars() 28023 1726853617.40603: done getting variables 28023 1726853617.40640: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:33:37 -0400 (0:00:00.050) 0:00:09.490 ****** 28023 1726853617.40666: entering _queue_task() for managed_node3/fail 28023 1726853617.40952: worker is 1 (out of 1 available) 28023 1726853617.40964: exiting _queue_task() for managed_node3/fail 28023 1726853617.40981: done queuing things up, now waiting for results queue to drain 28023 1726853617.40982: waiting for pending results... 
28023 1726853617.41251: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 28023 1726853617.41380: in run() - task 02083763-bbaf-fdb6-dad7-0000000002fc 28023 1726853617.41401: variable 'ansible_search_path' from source: unknown 28023 1726853617.41410: variable 'ansible_search_path' from source: unknown 28023 1726853617.41449: calling self._execute() 28023 1726853617.41542: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.41595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.41610: variable 'omit' from source: magic vars 28023 1726853617.42476: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.42480: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.42582: variable 'state' from source: include params 28023 1726853617.42684: Evaluated conditional (state not in ["present", "absent"]): False 28023 1726853617.42693: when evaluation is False, skipping this task 28023 1726853617.42701: _execute() done 28023 1726853617.42709: dumping result to json 28023 1726853617.42717: done dumping result, returning 28023 1726853617.42726: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-fdb6-dad7-0000000002fc] 28023 1726853617.42779: sending task result for task 02083763-bbaf-fdb6-dad7-0000000002fc skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 28023 1726853617.42913: no more pending results, returning what we have 28023 1726853617.42917: results queue empty 28023 1726853617.42918: checking for any_errors_fatal 28023 1726853617.42919: done checking for any_errors_fatal 28023 1726853617.42920: checking for max_fail_percentage 28023 1726853617.42921: done checking for max_fail_percentage 28023 1726853617.42922: checking to see if all hosts 
have failed and the running result is not ok 28023 1726853617.42923: done checking to see if all hosts have failed 28023 1726853617.42924: getting the remaining hosts for this loop 28023 1726853617.42925: done getting the remaining hosts for this loop 28023 1726853617.42928: getting the next task for host managed_node3 28023 1726853617.42934: done getting next task for host managed_node3 28023 1726853617.42937: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 28023 1726853617.42941: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853617.42944: getting variables 28023 1726853617.42946: in VariableManager get_vars() 28023 1726853617.42988: Calling all_inventory to load vars for managed_node3 28023 1726853617.42990: Calling groups_inventory to load vars for managed_node3 28023 1726853617.42993: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.43007: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.43011: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.43013: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.43348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.43764: done with get_vars() 28023 1726853617.43777: done getting variables 28023 1726853617.43809: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000002fc 28023 1726853617.43813: WORKER PROCESS EXITING 28023 1726853617.43848: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:33:37 -0400 (0:00:00.032) 0:00:09.522 ****** 28023 1726853617.43880: entering _queue_task() for managed_node3/fail 28023 1726853617.44135: worker is 1 (out of 1 available) 28023 1726853617.44148: exiting _queue_task() for managed_node3/fail 28023 1726853617.44162: done queuing things up, now waiting for results queue to drain 28023 1726853617.44163: waiting for pending results... 
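The next guard (task path `.../tasks/manage_test_interface.yml:8`) validates the interface `type` the same way. Again the `when` expression is taken from the `false_condition` in this log, and the message is assumed:

```yaml
# Sketch of manage_test_interface.yml:8. The `when` expression matches the
# false_condition recorded in this log; the msg text is hypothetical.
- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be one of 'dummy', 'tap', or 'veth'"
  when: type not in ["dummy", "tap", "veth"]
```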
28023 1726853617.44431: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 28023 1726853617.44540: in run() - task 02083763-bbaf-fdb6-dad7-0000000002fd 28023 1726853617.44560: variable 'ansible_search_path' from source: unknown 28023 1726853617.44569: variable 'ansible_search_path' from source: unknown 28023 1726853617.44619: calling self._execute() 28023 1726853617.44722: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.44733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.44748: variable 'omit' from source: magic vars 28023 1726853617.45123: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.45143: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.45345: variable 'type' from source: set_fact 28023 1726853617.45357: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 28023 1726853617.45364: when evaluation is False, skipping this task 28023 1726853617.45374: _execute() done 28023 1726853617.45383: dumping result to json 28023 1726853617.45391: done dumping result, returning 28023 1726853617.45402: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-fdb6-dad7-0000000002fd] 28023 1726853617.45416: sending task result for task 02083763-bbaf-fdb6-dad7-0000000002fd skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 28023 1726853617.45565: no more pending results, returning what we have 28023 1726853617.45569: results queue empty 28023 1726853617.45573: checking for any_errors_fatal 28023 1726853617.45579: done checking for any_errors_fatal 28023 1726853617.45580: checking for max_fail_percentage 28023 1726853617.45582: done checking for max_fail_percentage 28023 1726853617.45583: checking to see if all 
hosts have failed and the running result is not ok 28023 1726853617.45584: done checking to see if all hosts have failed 28023 1726853617.45585: getting the remaining hosts for this loop 28023 1726853617.45586: done getting the remaining hosts for this loop 28023 1726853617.45590: getting the next task for host managed_node3 28023 1726853617.45597: done getting next task for host managed_node3 28023 1726853617.45600: ^ task is: TASK: Include the task 'show_interfaces.yml' 28023 1726853617.45604: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853617.45608: getting variables 28023 1726853617.45610: in VariableManager get_vars() 28023 1726853617.45652: Calling all_inventory to load vars for managed_node3 28023 1726853617.45655: Calling groups_inventory to load vars for managed_node3 28023 1726853617.45658: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.46134: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.46139: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.46143: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.46591: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000002fd 28023 1726853617.46594: WORKER PROCESS EXITING 28023 1726853617.46601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.47016: done with get_vars() 28023 1726853617.47028: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:33:37 -0400 (0:00:00.032) 0:00:09.555 ****** 28023 1726853617.47211: entering _queue_task() for managed_node3/include_tasks 28023 1726853617.47589: worker is 1 (out of 1 available) 28023 1726853617.47600: exiting _queue_task() for managed_node3/include_tasks 28023 1726853617.47611: done queuing things up, now waiting for results queue to drain 28023 1726853617.47613: waiting for pending results... 
28023 1726853617.47789: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 28023 1726853617.47901: in run() - task 02083763-bbaf-fdb6-dad7-0000000002fe 28023 1726853617.47922: variable 'ansible_search_path' from source: unknown 28023 1726853617.47931: variable 'ansible_search_path' from source: unknown 28023 1726853617.47973: calling self._execute() 28023 1726853617.48074: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.48086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.48100: variable 'omit' from source: magic vars 28023 1726853617.48472: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.48491: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.48502: _execute() done 28023 1726853617.48512: dumping result to json 28023 1726853617.48520: done dumping result, returning 28023 1726853617.48530: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-fdb6-dad7-0000000002fe] 28023 1726853617.48540: sending task result for task 02083763-bbaf-fdb6-dad7-0000000002fe 28023 1726853617.48648: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000002fe 28023 1726853617.48655: WORKER PROCESS EXITING 28023 1726853617.48693: no more pending results, returning what we have 28023 1726853617.48698: in VariableManager get_vars() 28023 1726853617.48746: Calling all_inventory to load vars for managed_node3 28023 1726853617.48750: Calling groups_inventory to load vars for managed_node3 28023 1726853617.48752: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.48767: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.48772: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.48776: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.49095: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.49446: done with get_vars() 28023 1726853617.49454: variable 'ansible_search_path' from source: unknown 28023 1726853617.49456: variable 'ansible_search_path' from source: unknown 28023 1726853617.49491: we have included files to process 28023 1726853617.49492: generating all_blocks data 28023 1726853617.49494: done generating all_blocks data 28023 1726853617.49499: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853617.49500: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853617.49502: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28023 1726853617.49603: in VariableManager get_vars() 28023 1726853617.49627: done with get_vars() 28023 1726853617.49736: done processing included file 28023 1726853617.49738: iterating over new_blocks loaded from include file 28023 1726853617.49739: in VariableManager get_vars() 28023 1726853617.49757: done with get_vars() 28023 1726853617.49758: filtering new block on tags 28023 1726853617.49778: done filtering new block on tags 28023 1726853617.49780: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 28023 1726853617.49785: extending task lists for all hosts with included blocks 28023 1726853617.50225: done extending task lists 28023 1726853617.50227: done processing included files 28023 1726853617.50227: results queue empty 28023 1726853617.50228: checking for any_errors_fatal 28023 1726853617.50232: done checking for any_errors_fatal 28023 1726853617.50233: checking for 
max_fail_percentage 28023 1726853617.50234: done checking for max_fail_percentage 28023 1726853617.50235: checking to see if all hosts have failed and the running result is not ok 28023 1726853617.50236: done checking to see if all hosts have failed 28023 1726853617.50237: getting the remaining hosts for this loop 28023 1726853617.50238: done getting the remaining hosts for this loop 28023 1726853617.50240: getting the next task for host managed_node3 28023 1726853617.50244: done getting next task for host managed_node3 28023 1726853617.50247: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28023 1726853617.50250: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853617.50253: getting variables 28023 1726853617.50254: in VariableManager get_vars() 28023 1726853617.50267: Calling all_inventory to load vars for managed_node3 28023 1726853617.50269: Calling groups_inventory to load vars for managed_node3 28023 1726853617.50274: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.50279: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.50282: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.50284: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.50430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.50639: done with get_vars() 28023 1726853617.50649: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:33:37 -0400 (0:00:00.035) 0:00:09.591 ****** 28023 1726853617.50721: entering _queue_task() for managed_node3/include_tasks 28023 1726853617.51013: worker is 1 (out of 1 available) 28023 1726853617.51025: exiting _queue_task() for managed_node3/include_tasks 28023 1726853617.51040: done queuing things up, now waiting for results queue to drain 28023 1726853617.51041: waiting for pending results... 
28023 1726853617.51289: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 28023 1726853617.51362: in run() - task 02083763-bbaf-fdb6-dad7-000000000374 28023 1726853617.51376: variable 'ansible_search_path' from source: unknown 28023 1726853617.51380: variable 'ansible_search_path' from source: unknown 28023 1726853617.51407: calling self._execute() 28023 1726853617.51484: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.51489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.51493: variable 'omit' from source: magic vars 28023 1726853617.51749: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.51765: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.51769: _execute() done 28023 1726853617.51773: dumping result to json 28023 1726853617.51777: done dumping result, returning 28023 1726853617.51780: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-fdb6-dad7-000000000374] 28023 1726853617.51782: sending task result for task 02083763-bbaf-fdb6-dad7-000000000374 28023 1726853617.51866: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000374 28023 1726853617.51872: WORKER PROCESS EXITING 28023 1726853617.51899: no more pending results, returning what we have 28023 1726853617.51904: in VariableManager get_vars() 28023 1726853617.51948: Calling all_inventory to load vars for managed_node3 28023 1726853617.51952: Calling groups_inventory to load vars for managed_node3 28023 1726853617.51954: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.51970: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.51978: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.51985: Calling groups_plugins_play to load vars for managed_node3 28023 
1726853617.52155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.52287: done with get_vars() 28023 1726853617.52293: variable 'ansible_search_path' from source: unknown 28023 1726853617.52294: variable 'ansible_search_path' from source: unknown 28023 1726853617.52331: we have included files to process 28023 1726853617.52332: generating all_blocks data 28023 1726853617.52333: done generating all_blocks data 28023 1726853617.52334: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853617.52335: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853617.52336: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28023 1726853617.52510: done processing included file 28023 1726853617.52511: iterating over new_blocks loaded from include file 28023 1726853617.52512: in VariableManager get_vars() 28023 1726853617.52526: done with get_vars() 28023 1726853617.52527: filtering new block on tags 28023 1726853617.52539: done filtering new block on tags 28023 1726853617.52540: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 28023 1726853617.52543: extending task lists for all hosts with included blocks 28023 1726853617.52629: done extending task lists 28023 1726853617.52630: done processing included files 28023 1726853617.52630: results queue empty 28023 1726853617.52631: checking for any_errors_fatal 28023 1726853617.52634: done checking for any_errors_fatal 28023 1726853617.52635: checking for max_fail_percentage 28023 1726853617.52636: done 
checking for max_fail_percentage 28023 1726853617.52636: checking to see if all hosts have failed and the running result is not ok 28023 1726853617.52637: done checking to see if all hosts have failed 28023 1726853617.52637: getting the remaining hosts for this loop 28023 1726853617.52638: done getting the remaining hosts for this loop 28023 1726853617.52640: getting the next task for host managed_node3 28023 1726853617.52643: done getting next task for host managed_node3 28023 1726853617.52644: ^ task is: TASK: Gather current interface info 28023 1726853617.52646: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853617.52648: getting variables 28023 1726853617.52648: in VariableManager get_vars() 28023 1726853617.52659: Calling all_inventory to load vars for managed_node3 28023 1726853617.52660: Calling groups_inventory to load vars for managed_node3 28023 1726853617.52661: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.52665: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.52666: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.52668: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.52779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.52959: done with get_vars() 28023 1726853617.52968: done getting variables 28023 1726853617.53012: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:33:37 -0400 (0:00:00.023) 0:00:09.614 ****** 28023 1726853617.53044: entering _queue_task() for managed_node3/command 28023 1726853617.53341: worker is 1 (out of 1 available) 28023 1726853617.53354: exiting _queue_task() for managed_node3/command 28023 1726853617.53367: done queuing things up, now waiting for results queue to drain 28023 1726853617.53369: waiting for pending results... 
28023 1726853617.53687: running TaskExecutor() for managed_node3/TASK: Gather current interface info 28023 1726853617.53692: in run() - task 02083763-bbaf-fdb6-dad7-0000000003ab 28023 1726853617.53699: variable 'ansible_search_path' from source: unknown 28023 1726853617.53779: variable 'ansible_search_path' from source: unknown 28023 1726853617.53829: calling self._execute() 28023 1726853617.53934: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.53947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.53954: variable 'omit' from source: magic vars 28023 1726853617.54295: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.54306: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.54311: variable 'omit' from source: magic vars 28023 1726853617.54345: variable 'omit' from source: magic vars 28023 1726853617.54373: variable 'omit' from source: magic vars 28023 1726853617.54405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853617.54433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853617.54450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853617.54466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.54477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.54501: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853617.54504: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.54507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 
1726853617.54575: Set connection var ansible_shell_type to sh 28023 1726853617.54582: Set connection var ansible_shell_executable to /bin/sh 28023 1726853617.54587: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853617.54592: Set connection var ansible_connection to ssh 28023 1726853617.54597: Set connection var ansible_pipelining to False 28023 1726853617.54602: Set connection var ansible_timeout to 10 28023 1726853617.54623: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.54628: variable 'ansible_connection' from source: unknown 28023 1726853617.54631: variable 'ansible_module_compression' from source: unknown 28023 1726853617.54634: variable 'ansible_shell_type' from source: unknown 28023 1726853617.54637: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.54639: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.54641: variable 'ansible_pipelining' from source: unknown 28023 1726853617.54643: variable 'ansible_timeout' from source: unknown 28023 1726853617.54645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.54747: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853617.54756: variable 'omit' from source: magic vars 28023 1726853617.54776: starting attempt loop 28023 1726853617.54780: running the handler 28023 1726853617.54782: _low_level_execute_command(): starting 28023 1726853617.54787: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853617.55281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 28023 1726853617.55285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.55314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853617.55319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.55374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853617.55378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853617.55382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.55448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.57166: stdout chunk (state=3): >>>/root <<< 28023 1726853617.57258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853617.57294: stderr chunk (state=3): >>><<< 28023 1726853617.57297: stdout chunk (state=3): >>><<< 28023 1726853617.57311: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853617.57384: _low_level_execute_command(): starting 28023 1726853617.57388: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259 `" && echo ansible-tmp-1726853617.5731604-28529-172164874776259="` echo /root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259 `" ) && sleep 0' 28023 1726853617.57737: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853617.57752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853617.57770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.57817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853617.57822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.57893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.59891: stdout chunk (state=3): >>>ansible-tmp-1726853617.5731604-28529-172164874776259=/root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259 <<< 28023 1726853617.60000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853617.60029: stderr chunk (state=3): >>><<< 28023 1726853617.60031: stdout chunk (state=3): >>><<< 28023 1726853617.60046: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853617.5731604-28529-172164874776259=/root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853617.60080: variable 'ansible_module_compression' from source: unknown 28023 1726853617.60122: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853617.60154: variable 'ansible_facts' from source: unknown 28023 1726853617.60212: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259/AnsiballZ_command.py 28023 1726853617.60578: Sending initial data 28023 1726853617.60581: Sent initial data (156 bytes) 28023 1726853617.60986: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.61005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853617.61038: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853617.61125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853617.61165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.61265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.62963: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853617.63081: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853617.63109: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpfsmid0wf /root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259/AnsiballZ_command.py <<< 28023 1726853617.63134: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259/AnsiballZ_command.py" <<< 28023 1726853617.63204: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpfsmid0wf" to remote "/root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259/AnsiballZ_command.py" <<< 28023 1726853617.64219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853617.64223: stdout chunk (state=3): >>><<< 28023 1726853617.64227: stderr chunk (state=3): >>><<< 28023 1726853617.64229: done transferring module to remote 28023 1726853617.64232: _low_level_execute_command(): starting 28023 1726853617.64235: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259/ /root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259/AnsiballZ_command.py && sleep 0' 28023 1726853617.64601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853617.64614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.64627: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.64676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853617.64692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.64754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.66694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853617.66698: stdout chunk (state=3): >>><<< 28023 1726853617.66700: stderr chunk (state=3): >>><<< 28023 1726853617.66718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853617.66727: _low_level_execute_command(): starting 28023 1726853617.66809: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259/AnsiballZ_command.py && sleep 0' 28023 1726853617.67392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.67422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853617.67439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853617.67464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.67549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 28023 1726853617.83428: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:33:37.829623", "end": "2024-09-20 13:33:37.833020", "delta": "0:00:00.003397", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853617.85117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853617.85130: stdout chunk (state=3): >>><<< 28023 1726853617.85141: stderr chunk (state=3): >>><<< 28023 1726853617.85281: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:33:37.829623", "end": "2024-09-20 13:33:37.833020", "delta": "0:00:00.003397", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853617.85286: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853617.85288: _low_level_execute_command(): starting 28023 1726853617.85291: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853617.5731604-28529-172164874776259/ > /dev/null 2>&1 && sleep 0' 28023 1726853617.85779: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853617.85801: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853617.85821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.85884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853617.85891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853617.85893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.85948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.87816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853617.88035: stderr chunk (state=3): >>><<< 28023 1726853617.88039: stdout chunk (state=3): >>><<< 28023 1726853617.88042: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853617.88045: handler run complete 28023 1726853617.88047: Evaluated conditional (False): False 28023 1726853617.88049: attempt loop complete, returning result 28023 1726853617.88052: _execute() done 28023 1726853617.88054: dumping result to json 28023 1726853617.88058: done dumping result, returning 28023 1726853617.88061: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-fdb6-dad7-0000000003ab] 28023 1726853617.88063: sending task result for task 02083763-bbaf-fdb6-dad7-0000000003ab 28023 1726853617.88142: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000003ab 28023 1726853617.88146: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003397", "end": "2024-09-20 13:33:37.833020", "rc": 0, "start": "2024-09-20 13:33:37.829623" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 rpltstbr 28023 1726853617.88391: no more pending results, returning what we have 28023 1726853617.88395: results queue empty 28023 1726853617.88396: checking for any_errors_fatal 28023 1726853617.88397: done checking for any_errors_fatal 28023 1726853617.88397: checking for max_fail_percentage 28023 1726853617.88399: done checking for max_fail_percentage 28023 
1726853617.88400: checking to see if all hosts have failed and the running result is not ok 28023 1726853617.88400: done checking to see if all hosts have failed 28023 1726853617.88401: getting the remaining hosts for this loop 28023 1726853617.88403: done getting the remaining hosts for this loop 28023 1726853617.88406: getting the next task for host managed_node3 28023 1726853617.88412: done getting next task for host managed_node3 28023 1726853617.88415: ^ task is: TASK: Set current_interfaces 28023 1726853617.88420: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853617.88424: getting variables 28023 1726853617.88425: in VariableManager get_vars() 28023 1726853617.88459: Calling all_inventory to load vars for managed_node3 28023 1726853617.88462: Calling groups_inventory to load vars for managed_node3 28023 1726853617.88465: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.88485: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.88488: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.88490: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.88627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.88754: done with get_vars() 28023 1726853617.88764: done getting variables 28023 1726853617.88809: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:33:37 -0400 (0:00:00.357) 0:00:09.972 ****** 28023 1726853617.88831: entering _queue_task() for managed_node3/set_fact 28023 1726853617.89046: worker is 1 (out of 1 available) 28023 1726853617.89061: exiting _queue_task() for managed_node3/set_fact 28023 1726853617.89077: done queuing things up, now waiting for results queue to drain 28023 1726853617.89079: waiting for pending results... 
28023 1726853617.89231: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 28023 1726853617.89309: in run() - task 02083763-bbaf-fdb6-dad7-0000000003ac 28023 1726853617.89318: variable 'ansible_search_path' from source: unknown 28023 1726853617.89322: variable 'ansible_search_path' from source: unknown 28023 1726853617.89350: calling self._execute() 28023 1726853617.89423: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.89426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.89436: variable 'omit' from source: magic vars 28023 1726853617.89692: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.89702: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.89708: variable 'omit' from source: magic vars 28023 1726853617.89743: variable 'omit' from source: magic vars 28023 1726853617.89817: variable '_current_interfaces' from source: set_fact 28023 1726853617.89870: variable 'omit' from source: magic vars 28023 1726853617.89904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853617.89929: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853617.89945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853617.89962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.89973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.89996: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853617.89999: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.90001: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.90066: Set connection var ansible_shell_type to sh 28023 1726853617.90069: Set connection var ansible_shell_executable to /bin/sh 28023 1726853617.90077: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853617.90084: Set connection var ansible_connection to ssh 28023 1726853617.90089: Set connection var ansible_pipelining to False 28023 1726853617.90093: Set connection var ansible_timeout to 10 28023 1726853617.90112: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.90115: variable 'ansible_connection' from source: unknown 28023 1726853617.90117: variable 'ansible_module_compression' from source: unknown 28023 1726853617.90120: variable 'ansible_shell_type' from source: unknown 28023 1726853617.90122: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.90124: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.90128: variable 'ansible_pipelining' from source: unknown 28023 1726853617.90130: variable 'ansible_timeout' from source: unknown 28023 1726853617.90134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.90233: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853617.90242: variable 'omit' from source: magic vars 28023 1726853617.90248: starting attempt loop 28023 1726853617.90251: running the handler 28023 1726853617.90261: handler run complete 28023 1726853617.90268: attempt loop complete, returning result 28023 1726853617.90272: _execute() done 28023 1726853617.90275: dumping result to json 28023 1726853617.90278: done dumping result, returning 28023 
1726853617.90286: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [02083763-bbaf-fdb6-dad7-0000000003ac] 28023 1726853617.90290: sending task result for task 02083763-bbaf-fdb6-dad7-0000000003ac 28023 1726853617.90377: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000003ac 28023 1726853617.90379: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0", "rpltstbr" ] }, "changed": false } 28023 1726853617.90508: no more pending results, returning what we have 28023 1726853617.90511: results queue empty 28023 1726853617.90511: checking for any_errors_fatal 28023 1726853617.90517: done checking for any_errors_fatal 28023 1726853617.90518: checking for max_fail_percentage 28023 1726853617.90519: done checking for max_fail_percentage 28023 1726853617.90520: checking to see if all hosts have failed and the running result is not ok 28023 1726853617.90521: done checking to see if all hosts have failed 28023 1726853617.90522: getting the remaining hosts for this loop 28023 1726853617.90523: done getting the remaining hosts for this loop 28023 1726853617.90526: getting the next task for host managed_node3 28023 1726853617.90540: done getting next task for host managed_node3 28023 1726853617.90542: ^ task is: TASK: Show current_interfaces 28023 1726853617.90546: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853617.90550: getting variables 28023 1726853617.90551: in VariableManager get_vars() 28023 1726853617.90614: Calling all_inventory to load vars for managed_node3 28023 1726853617.90616: Calling groups_inventory to load vars for managed_node3 28023 1726853617.90619: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.90628: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.90631: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.90635: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.90812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.91057: done with get_vars() 28023 1726853617.91067: done getting variables 28023 1726853617.91123: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:33:37 -0400 (0:00:00.023) 0:00:09.995 ****** 28023 1726853617.91152: entering _queue_task() for managed_node3/debug 28023 1726853617.91397: worker is 1 (out of 1 available) 28023 1726853617.91409: exiting _queue_task() for managed_node3/debug 28023 1726853617.91421: done queuing things up, now waiting for results queue to drain 28023 
1726853617.91423: waiting for pending results... 28023 1726853617.91783: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 28023 1726853617.91792: in run() - task 02083763-bbaf-fdb6-dad7-000000000375 28023 1726853617.91796: variable 'ansible_search_path' from source: unknown 28023 1726853617.91800: variable 'ansible_search_path' from source: unknown 28023 1726853617.91842: calling self._execute() 28023 1726853617.91936: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.91949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.91964: variable 'omit' from source: magic vars 28023 1726853617.92261: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.92285: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.92296: variable 'omit' from source: magic vars 28023 1726853617.92322: variable 'omit' from source: magic vars 28023 1726853617.92391: variable 'current_interfaces' from source: set_fact 28023 1726853617.92415: variable 'omit' from source: magic vars 28023 1726853617.92446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853617.92475: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853617.92491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853617.92511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.92514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.92538: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853617.92541: variable 'ansible_host' from source: host vars for 
'managed_node3' 28023 1726853617.92544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.92611: Set connection var ansible_shell_type to sh 28023 1726853617.92621: Set connection var ansible_shell_executable to /bin/sh 28023 1726853617.92623: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853617.92628: Set connection var ansible_connection to ssh 28023 1726853617.92634: Set connection var ansible_pipelining to False 28023 1726853617.92639: Set connection var ansible_timeout to 10 28023 1726853617.92663: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.92666: variable 'ansible_connection' from source: unknown 28023 1726853617.92669: variable 'ansible_module_compression' from source: unknown 28023 1726853617.92672: variable 'ansible_shell_type' from source: unknown 28023 1726853617.92675: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.92677: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.92679: variable 'ansible_pipelining' from source: unknown 28023 1726853617.92681: variable 'ansible_timeout' from source: unknown 28023 1726853617.92683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.92783: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853617.92791: variable 'omit' from source: magic vars 28023 1726853617.92796: starting attempt loop 28023 1726853617.92799: running the handler 28023 1726853617.92835: handler run complete 28023 1726853617.92851: attempt loop complete, returning result 28023 1726853617.92854: _execute() done 28023 1726853617.92857: dumping result to json 28023 1726853617.92859: done 
dumping result, returning 28023 1726853617.92868: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [02083763-bbaf-fdb6-dad7-000000000375] 28023 1726853617.92872: sending task result for task 02083763-bbaf-fdb6-dad7-000000000375 28023 1726853617.92948: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000375 28023 1726853617.92951: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0', 'rpltstbr'] 28023 1726853617.93022: no more pending results, returning what we have 28023 1726853617.93026: results queue empty 28023 1726853617.93026: checking for any_errors_fatal 28023 1726853617.93031: done checking for any_errors_fatal 28023 1726853617.93032: checking for max_fail_percentage 28023 1726853617.93034: done checking for max_fail_percentage 28023 1726853617.93036: checking to see if all hosts have failed and the running result is not ok 28023 1726853617.93037: done checking to see if all hosts have failed 28023 1726853617.93037: getting the remaining hosts for this loop 28023 1726853617.93039: done getting the remaining hosts for this loop 28023 1726853617.93042: getting the next task for host managed_node3 28023 1726853617.93050: done getting next task for host managed_node3 28023 1726853617.93053: ^ task is: TASK: Install iproute 28023 1726853617.93055: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853617.93059: getting variables 28023 1726853617.93062: in VariableManager get_vars() 28023 1726853617.93098: Calling all_inventory to load vars for managed_node3 28023 1726853617.93100: Calling groups_inventory to load vars for managed_node3 28023 1726853617.93102: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853617.93111: Calling all_plugins_play to load vars for managed_node3 28023 1726853617.93114: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853617.93116: Calling groups_plugins_play to load vars for managed_node3 28023 1726853617.93242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853617.93369: done with get_vars() 28023 1726853617.93378: done getting variables 28023 1726853617.93417: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:33:37 -0400 (0:00:00.022) 0:00:10.018 ****** 28023 1726853617.93437: entering _queue_task() for managed_node3/package 28023 1726853617.93638: worker is 1 (out of 1 available) 28023 1726853617.93651: exiting _queue_task() for managed_node3/package 28023 1726853617.93664: done queuing things up, now waiting for results queue to drain 28023 1726853617.93665: waiting for pending results... 
28023 1726853617.93844: running TaskExecutor() for managed_node3/TASK: Install iproute 28023 1726853617.93921: in run() - task 02083763-bbaf-fdb6-dad7-0000000002ff 28023 1726853617.93941: variable 'ansible_search_path' from source: unknown 28023 1726853617.93978: variable 'ansible_search_path' from source: unknown 28023 1726853617.93991: calling self._execute() 28023 1726853617.94077: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.94114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.94117: variable 'omit' from source: magic vars 28023 1726853617.94476: variable 'ansible_distribution_major_version' from source: facts 28023 1726853617.94479: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853617.94482: variable 'omit' from source: magic vars 28023 1726853617.94502: variable 'omit' from source: magic vars 28023 1726853617.94687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853617.96533: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853617.96580: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853617.96609: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853617.96634: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853617.96654: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853617.96727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853617.96755: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853617.96776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853617.96806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853617.96816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853617.96898: variable '__network_is_ostree' from source: set_fact 28023 1726853617.96902: variable 'omit' from source: magic vars 28023 1726853617.96921: variable 'omit' from source: magic vars 28023 1726853617.96942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853617.96964: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853617.96980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853617.96993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.97005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853617.97027: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853617.97030: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.97034: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 28023 1726853617.97100: Set connection var ansible_shell_type to sh 28023 1726853617.97111: Set connection var ansible_shell_executable to /bin/sh 28023 1726853617.97114: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853617.97117: Set connection var ansible_connection to ssh 28023 1726853617.97125: Set connection var ansible_pipelining to False 28023 1726853617.97128: Set connection var ansible_timeout to 10 28023 1726853617.97147: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.97150: variable 'ansible_connection' from source: unknown 28023 1726853617.97153: variable 'ansible_module_compression' from source: unknown 28023 1726853617.97155: variable 'ansible_shell_type' from source: unknown 28023 1726853617.97157: variable 'ansible_shell_executable' from source: unknown 28023 1726853617.97163: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853617.97165: variable 'ansible_pipelining' from source: unknown 28023 1726853617.97169: variable 'ansible_timeout' from source: unknown 28023 1726853617.97174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853617.97244: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853617.97253: variable 'omit' from source: magic vars 28023 1726853617.97258: starting attempt loop 28023 1726853617.97264: running the handler 28023 1726853617.97269: variable 'ansible_facts' from source: unknown 28023 1726853617.97274: variable 'ansible_facts' from source: unknown 28023 1726853617.97300: _low_level_execute_command(): starting 28023 1726853617.97307: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 
1726853617.97776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853617.97802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853617.97806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853617.97911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853617.97915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853617.97989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853617.99709: stdout chunk (state=3): >>>/root <<< 28023 1726853617.99809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853617.99837: stderr chunk (state=3): >>><<< 28023 1726853617.99841: stdout chunk (state=3): >>><<< 28023 1726853617.99861: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853617.99883: _low_level_execute_command(): starting 28023 1726853617.99887: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985 `" && echo ansible-tmp-1726853617.9986353-28563-161881046210985="` echo /root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985 `" ) && sleep 0' 28023 1726853618.00328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.00332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853618.00334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853618.00336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.00377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853618.00395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.00455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.02425: stdout chunk (state=3): >>>ansible-tmp-1726853617.9986353-28563-161881046210985=/root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985 <<< 28023 1726853618.02530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853618.02557: stderr chunk (state=3): >>><<< 28023 1726853618.02561: stdout chunk (state=3): >>><<< 28023 1726853618.02583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853617.9986353-28563-161881046210985=/root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853618.02609: variable 'ansible_module_compression' from source: unknown 28023 1726853618.02652: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 28023 1726853618.02693: variable 'ansible_facts' from source: unknown 28023 1726853618.02775: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985/AnsiballZ_dnf.py 28023 1726853618.02884: Sending initial data 28023 1726853618.02888: Sent initial data (152 bytes) 28023 1726853618.03336: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.03339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853618.03342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.03344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.03347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.03414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853618.03421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.03484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.05117: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28023 1726853618.05120: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853618.05178: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853618.05233: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp6635kvvt /root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985/AnsiballZ_dnf.py <<< 28023 1726853618.05236: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985/AnsiballZ_dnf.py" <<< 28023 1726853618.05300: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 28023 1726853618.05303: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp6635kvvt" to remote "/root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985/AnsiballZ_dnf.py" <<< 28023 1726853618.06026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853618.06073: stderr chunk (state=3): >>><<< 28023 1726853618.06076: stdout chunk (state=3): >>><<< 28023 1726853618.06113: done transferring module to remote 28023 1726853618.06122: _low_level_execute_command(): starting 28023 1726853618.06127: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985/ /root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985/AnsiballZ_dnf.py && sleep 0' 28023 1726853618.06579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853618.06582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853618.06584: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.06586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853618.06590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.06592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.06644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853618.06651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853618.06653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.06713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.08541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853618.08569: stderr chunk (state=3): >>><<< 28023 1726853618.08574: stdout chunk (state=3): >>><<< 28023 1726853618.08583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853618.08586: _low_level_execute_command(): starting 28023 1726853618.08591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985/AnsiballZ_dnf.py && sleep 0' 28023 1726853618.09005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.09009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.09011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853618.09013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.09062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853618.09065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.09137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.51270: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 28023 1726853618.55713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
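The stdout chunk above is the dnf module's complete result: a single JSON object printed by the remote module and collected by the controller. A minimal sketch of how such a result can be interpreted (this is simplified illustration, not Ansible's actual result-handling code; the `raw` string below abridges the full `module_args` shown in the log):

```python
import json

# Abridged copy of the module result from the stdout chunk above. The remote
# module communicates with the controller purely via JSON on stdout.
raw = '''{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0,
"invocation": {"module_args": {"name": ["iproute"], "state": "present"}}}'''

result = json.loads(raw)

# rc=0 with changed=false means iproute was already installed: the task is
# reported "ok" rather than "changed", matching the callback output further on.
assert result["rc"] == 0 and result["changed"] is False
print(result["msg"])                                # Nothing to do
print(result["invocation"]["module_args"]["name"])  # ['iproute']
```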
<<< 28023 1726853618.55737: stderr chunk (state=3): >>><<< 28023 1726853618.55741: stdout chunk (state=3): >>><<< 28023 1726853618.55759: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853618.55797: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853618.55806: _low_level_execute_command(): starting 28023 1726853618.55808: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853617.9986353-28563-161881046210985/ > /dev/null 2>&1 && sleep 0' 28023 1726853618.56242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.56246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.56248: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.56250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.56301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853618.56304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.56374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.58277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853618.58281: stdout chunk (state=3): >>><<< 28023 1726853618.58284: stderr chunk (state=3): >>><<< 28023 1726853618.58378: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853618.58382: handler run complete 28023 1726853618.58501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853618.58636: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853618.58746: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853618.58750: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853618.58752: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853618.58754: variable '__install_status' from source: set_fact 28023 1726853618.58769: Evaluated conditional (__install_status is success): True 28023 1726853618.58783: attempt loop complete, returning result 28023 1726853618.58786: _execute() done 28023 1726853618.58788: dumping result to json 28023 1726853618.58794: done dumping result, returning 28023 1726853618.58801: done running TaskExecutor() for managed_node3/TASK: Install iproute [02083763-bbaf-fdb6-dad7-0000000002ff] 28023 1726853618.58805: sending task result for task 02083763-bbaf-fdb6-dad7-0000000002ff 28023 1726853618.58903: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000002ff 28023 1726853618.58905: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 28023 1726853618.58984: no more pending results, returning what we have 28023 1726853618.58987: results queue empty 28023 1726853618.58988: checking for any_errors_fatal 28023 1726853618.58993: done checking for any_errors_fatal 28023 1726853618.58993: checking for 
max_fail_percentage 28023 1726853618.58995: done checking for max_fail_percentage 28023 1726853618.58995: checking to see if all hosts have failed and the running result is not ok 28023 1726853618.58996: done checking to see if all hosts have failed 28023 1726853618.58997: getting the remaining hosts for this loop 28023 1726853618.58999: done getting the remaining hosts for this loop 28023 1726853618.59002: getting the next task for host managed_node3 28023 1726853618.59008: done getting next task for host managed_node3 28023 1726853618.59011: ^ task is: TASK: Create veth interface {{ interface }} 28023 1726853618.59013: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853618.59018: getting variables 28023 1726853618.59019: in VariableManager get_vars() 28023 1726853618.59060: Calling all_inventory to load vars for managed_node3 28023 1726853618.59063: Calling groups_inventory to load vars for managed_node3 28023 1726853618.59065: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853618.59082: Calling all_plugins_play to load vars for managed_node3 28023 1726853618.59084: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853618.59087: Calling groups_plugins_play to load vars for managed_node3 28023 1726853618.59265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853618.59393: done with get_vars() 28023 1726853618.59403: done getting variables 28023 1726853618.59444: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853618.59531: variable 'interface' from source: set_fact TASK [Create veth interface ethtest1] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:33:38 -0400 (0:00:00.661) 0:00:10.679 ****** 28023 1726853618.59552: entering _queue_task() for managed_node3/command 28023 1726853618.59759: worker is 1 (out of 1 available) 28023 1726853618.59774: exiting _queue_task() for managed_node3/command 28023 1726853618.59786: done queuing things up, now waiting for results queue to drain 28023 1726853618.59788: waiting for pending results... 
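The veth task now repeats the same remote-execution handshake, and its first `_low_level_execute_command` (visible just below) creates a per-task tmp directory with the `( umask 77 && mkdir -p ... && mkdir ... && echo ... )` pattern used throughout this log. A local, ssh-free sketch of that step (a hypothetical base path stands in for `/root/.ansible/tmp`):

```python
import os
import stat
import tempfile
import time

# Local stand-in for the remote ~/.ansible/tmp base directory.
base = os.path.join(tempfile.gettempdir(), "ansible-demo-tmp")

old = os.umask(0o077)  # the '( umask 77 && ... )' subshell in the log
try:
    os.makedirs(base, exist_ok=True)                    # mkdir -p "$base"
    stamp = f"ansible-tmp-{time.time()}-{os.getpid()}"  # timestamp-pid naming, as in the log
    path = os.path.join(base, stamp)
    os.mkdir(path)                                      # plain mkdir: must not pre-exist
finally:
    os.umask(old)

print(f"{stamp}={path}")  # the trailing echo reports the resolved path to the controller

# umask 077 makes the directory mode 0700: private to the remote user.
assert stat.S_IMODE(os.stat(path).st_mode) == 0o700
```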
28023 1726853618.59945: running TaskExecutor() for managed_node3/TASK: Create veth interface ethtest1 28023 1726853618.60021: in run() - task 02083763-bbaf-fdb6-dad7-000000000300 28023 1726853618.60027: variable 'ansible_search_path' from source: unknown 28023 1726853618.60030: variable 'ansible_search_path' from source: unknown 28023 1726853618.60234: variable 'interface' from source: set_fact 28023 1726853618.60289: variable 'interface' from source: set_fact 28023 1726853618.60476: variable 'interface' from source: set_fact 28023 1726853618.60501: Loaded config def from plugin (lookup/items) 28023 1726853618.60514: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 28023 1726853618.60540: variable 'omit' from source: magic vars 28023 1726853618.60687: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853618.60707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853618.60721: variable 'omit' from source: magic vars 28023 1726853618.60960: variable 'ansible_distribution_major_version' from source: facts 28023 1726853618.60976: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853618.61191: variable 'type' from source: set_fact 28023 1726853618.61201: variable 'state' from source: include params 28023 1726853618.61208: variable 'interface' from source: set_fact 28023 1726853618.61216: variable 'current_interfaces' from source: set_fact 28023 1726853618.61226: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28023 1726853618.61246: variable 'omit' from source: magic vars 28023 1726853618.61289: variable 'omit' from source: magic vars 28023 1726853618.61336: variable 'item' from source: unknown 28023 1726853618.61414: variable 'item' from source: unknown 28023 1726853618.61421: variable 'omit' from source: magic vars 28023 1726853618.61446: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853618.61476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853618.61491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853618.61504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853618.61513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853618.61535: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853618.61538: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853618.61541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853618.61611: Set connection var ansible_shell_type to sh 28023 1726853618.61618: Set connection var ansible_shell_executable to /bin/sh 28023 1726853618.61623: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853618.61628: Set connection var ansible_connection to ssh 28023 1726853618.61633: Set connection var ansible_pipelining to False 28023 1726853618.61638: Set connection var ansible_timeout to 10 28023 1726853618.61655: variable 'ansible_shell_executable' from source: unknown 28023 1726853618.61660: variable 'ansible_connection' from source: unknown 28023 1726853618.61663: variable 'ansible_module_compression' from source: unknown 28023 1726853618.61665: variable 'ansible_shell_type' from source: unknown 28023 1726853618.61675: variable 'ansible_shell_executable' from source: unknown 28023 1726853618.61677: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853618.61680: variable 'ansible_pipelining' from source: unknown 28023 1726853618.61682: variable 'ansible_timeout' from 
source: unknown 28023 1726853618.61683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853618.61775: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853618.61783: variable 'omit' from source: magic vars 28023 1726853618.61788: starting attempt loop 28023 1726853618.61793: running the handler 28023 1726853618.61806: _low_level_execute_command(): starting 28023 1726853618.61813: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853618.62303: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.62307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.62310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.62312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.62365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853618.62374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.62437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.64116: stdout chunk (state=3): >>>/root <<< 28023 1726853618.64211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853618.64237: stderr chunk (state=3): >>><<< 28023 1726853618.64241: stdout chunk (state=3): >>><<< 28023 1726853618.64261: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853618.64276: _low_level_execute_command(): starting 28023 1726853618.64287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020 `" && echo ansible-tmp-1726853618.6426203-28594-155108171949020="` echo /root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020 `" ) && sleep 0' 28023 1726853618.64923: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853618.64944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853618.64962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.64992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853618.65074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.65116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853618.65142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853618.65192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.65253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.67278: stdout chunk (state=3): 
>>>ansible-tmp-1726853618.6426203-28594-155108171949020=/root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020 <<< 28023 1726853618.67416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853618.67428: stdout chunk (state=3): >>><<< 28023 1726853618.67444: stderr chunk (state=3): >>><<< 28023 1726853618.67479: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853618.6426203-28594-155108171949020=/root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853618.67553: variable 'ansible_module_compression' from source: unknown 28023 1726853618.67572: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853618.67601: variable 
'ansible_facts' from source: unknown 28023 1726853618.67655: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020/AnsiballZ_command.py 28023 1726853618.67755: Sending initial data 28023 1726853618.67758: Sent initial data (156 bytes) 28023 1726853618.68197: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.68200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853618.68202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853618.68204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853618.68207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.68257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853618.68260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.68326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.69947: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 
debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 28023 1726853618.69951: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853618.70002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853618.70064: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmplljy929w /root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020/AnsiballZ_command.py <<< 28023 1726853618.70066: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020/AnsiballZ_command.py" <<< 28023 1726853618.70117: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmplljy929w" to remote "/root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020/AnsiballZ_command.py" <<< 28023 1726853618.70120: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020/AnsiballZ_command.py" <<< 28023 1726853618.70726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853618.70763: stderr chunk (state=3): >>><<< 28023 1726853618.70766: stdout chunk (state=3): >>><<< 28023 1726853618.70790: done transferring module to remote 28023 1726853618.70799: 
_low_level_execute_command(): starting 28023 1726853618.70805: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020/ /root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020/AnsiballZ_command.py && sleep 0' 28023 1726853618.71255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853618.71258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853618.71261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853618.71263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853618.71265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.71313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853618.71317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.71386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.73178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853618.73203: 
stderr chunk (state=3): >>><<< 28023 1726853618.73206: stdout chunk (state=3): >>><<< 28023 1726853618.73224: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853618.73227: _low_level_execute_command(): starting 28023 1726853618.73231: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020/AnsiballZ_command.py && sleep 0' 28023 1726853618.73668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.73674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.73676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853618.73679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853618.73681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853618.73728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853618.73732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.73804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.89823: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-20 13:33:38.889587", "end": "2024-09-20 13:33:38.896136", "delta": "0:00:00.006549", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853618.93002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853618.93006: stdout chunk (state=3): >>><<< 28023 1726853618.93008: stderr chunk (state=3): >>><<< 28023 1726853618.93025: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-20 13:33:38.889587", "end": "2024-09-20 13:33:38.896136", "delta": "0:00:00.006549", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853618.93087: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest1 type veth peer name peerethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853618.93177: _low_level_execute_command(): starting 28023 1726853618.93180: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853618.6426203-28594-155108171949020/ > /dev/null 2>&1 && sleep 0' 28023 1726853618.93736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853618.93749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853618.93767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853618.93887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853618.93901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853618.93918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853618.94025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853618.98040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853618.98054: stdout chunk (state=3): >>><<< 28023 1726853618.98067: stderr chunk (state=3): >>><<< 28023 1726853618.98094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 28023 1726853618.98107: handler run complete 28023 1726853618.98135: Evaluated conditional (False): False 28023 1726853618.98150: attempt loop complete, returning result 28023 1726853618.98177: variable 'item' from source: unknown 28023 1726853618.98257: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add ethtest1 type veth peer name peerethtest1) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1" ], "delta": "0:00:00.006549", "end": "2024-09-20 13:33:38.896136", "item": "ip link add ethtest1 type veth peer name peerethtest1", "rc": 0, "start": "2024-09-20 13:33:38.889587" } 28023 1726853618.98686: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853618.98690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853618.98692: variable 'omit' from source: magic vars 28023 1726853618.98702: variable 'ansible_distribution_major_version' from source: facts 28023 1726853618.98712: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853618.98892: variable 'type' from source: set_fact 28023 1726853618.98907: variable 'state' from source: include params 28023 1726853618.98916: variable 'interface' from source: set_fact 28023 1726853618.98924: variable 'current_interfaces' from source: set_fact 28023 1726853618.98935: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28023 1726853618.98944: variable 'omit' from source: magic vars 28023 1726853618.98963: variable 'omit' from source: magic vars 28023 1726853618.99120: variable 'item' from source: unknown 28023 1726853618.99124: variable 'item' from source: unknown 28023 1726853618.99126: variable 'omit' from source: magic vars 28023 1726853618.99129: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853618.99131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853618.99140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853618.99159: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853618.99168: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853618.99178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853618.99259: Set connection var ansible_shell_type to sh 28023 1726853618.99269: Set connection var ansible_shell_executable to /bin/sh 28023 1726853618.99280: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853618.99287: Set connection var ansible_connection to ssh 28023 1726853618.99294: Set connection var ansible_pipelining to False 28023 1726853618.99301: Set connection var ansible_timeout to 10 28023 1726853618.99323: variable 'ansible_shell_executable' from source: unknown 28023 1726853618.99329: variable 'ansible_connection' from source: unknown 28023 1726853618.99337: variable 'ansible_module_compression' from source: unknown 28023 1726853618.99342: variable 'ansible_shell_type' from source: unknown 28023 1726853618.99347: variable 'ansible_shell_executable' from source: unknown 28023 1726853618.99352: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853618.99357: variable 'ansible_pipelining' from source: unknown 28023 1726853618.99362: variable 'ansible_timeout' from source: unknown 28023 1726853618.99368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853618.99465: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853618.99483: variable 'omit' from source: magic vars 28023 1726853618.99492: starting attempt loop 28023 1726853618.99498: running the handler 28023 1726853618.99510: _low_level_execute_command(): starting 28023 1726853618.99518: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853619.00137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853619.00150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.00164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853619.00202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853619.00299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.00328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 
1726853619.00421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.02291: stdout chunk (state=3): >>>/root <<< 28023 1726853619.02301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.02314: stdout chunk (state=3): >>><<< 28023 1726853619.02388: stderr chunk (state=3): >>><<< 28023 1726853619.02392: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.02395: _low_level_execute_command(): starting 28023 1726853619.02397: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666 `" && echo ansible-tmp-1726853619.0234077-28594-278872830011666="` echo 
/root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666 `" ) && sleep 0' 28023 1726853619.03011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853619.03027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.03050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853619.03158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.03183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.03200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.03300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.05284: stdout chunk (state=3): >>>ansible-tmp-1726853619.0234077-28594-278872830011666=/root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666 <<< 28023 1726853619.05442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.05446: stdout chunk (state=3): >>><<< 28023 1726853619.05448: stderr chunk (state=3): >>><<< 28023 
1726853619.05577: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853619.0234077-28594-278872830011666=/root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.05581: variable 'ansible_module_compression' from source: unknown 28023 1726853619.05583: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853619.05585: variable 'ansible_facts' from source: unknown 28023 1726853619.05649: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666/AnsiballZ_command.py 28023 1726853619.05870: Sending initial data 28023 1726853619.05952: Sent initial data (156 bytes) 28023 1726853619.06594: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.06628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.06650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.06675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.06767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.08401: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853619.08484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853619.08560: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmploi2q_nv /root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666/AnsiballZ_command.py <<< 28023 1726853619.08565: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666/AnsiballZ_command.py" <<< 28023 1726853619.08610: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmploi2q_nv" to remote "/root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666/AnsiballZ_command.py" <<< 28023 1726853619.09533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.09536: stderr chunk (state=3): >>><<< 28023 1726853619.09539: stdout chunk (state=3): >>><<< 28023 1726853619.09541: done transferring module to remote 28023 1726853619.09543: _low_level_execute_command(): starting 28023 1726853619.09545: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666/ /root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666/AnsiballZ_command.py && sleep 0' 28023 1726853619.10189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853619.10211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.10286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.10338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.10353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.10376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.10461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.12352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.12406: stdout chunk (state=3): >>><<< 28023 1726853619.12410: stderr chunk (state=3): >>><<< 28023 1726853619.12426: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.12435: _low_level_execute_command(): starting 28023 1726853619.12476: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666/AnsiballZ_command.py && sleep 0' 28023 1726853619.13091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853619.13104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.13116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853619.13193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.13233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.13258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.13324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.29143: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-20 13:33:39.286168", "end": "2024-09-20 13:33:39.290203", "delta": "0:00:00.004035", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853619.30733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853619.30763: stderr chunk (state=3): >>><<< 28023 1726853619.30766: stdout chunk (state=3): >>><<< 28023 1726853619.30784: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-20 13:33:39.286168", "end": "2024-09-20 13:33:39.290203", "delta": "0:00:00.004035", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853619.30813: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853619.30818: _low_level_execute_command(): starting 28023 1726853619.30823: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853619.0234077-28594-278872830011666/ > /dev/null 2>&1 && sleep 0' 28023 1726853619.31249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.31260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853619.31283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853619.31286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.31341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.31349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.31353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.31405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.33257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.33287: stderr chunk (state=3): >>><<< 28023 1726853619.33290: stdout chunk (state=3): >>><<< 28023 1726853619.33303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.33308: handler run complete 28023 1726853619.33323: Evaluated conditional (False): False 28023 1726853619.33331: attempt loop complete, returning result 28023 1726853619.33345: variable 'item' from source: unknown 28023 1726853619.33410: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest1", "up" ], "delta": "0:00:00.004035", "end": "2024-09-20 13:33:39.290203", "item": "ip link set peerethtest1 up", "rc": 0, "start": "2024-09-20 13:33:39.286168" } 28023 1726853619.33525: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853619.33528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853619.33531: variable 'omit' from source: magic vars 28023 1726853619.33625: variable 'ansible_distribution_major_version' from source: facts 28023 1726853619.33629: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853619.33745: variable 'type' from source: set_fact 28023 1726853619.33750: variable 'state' from source: include params 28023 1726853619.33753: variable 'interface' from source: set_fact 28023 1726853619.33755: variable 'current_interfaces' from source: set_fact 28023 1726853619.33760: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28023 1726853619.33770: variable 'omit' from source: magic vars 28023 1726853619.33781: variable 'omit' from source: magic vars 28023 1726853619.33807: variable 'item' from source: unknown 28023 1726853619.33851: variable 'item' from source: unknown 28023 1726853619.33862: variable 'omit' from source: magic vars 28023 1726853619.33883: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853619.33890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853619.33898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853619.33907: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853619.33910: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853619.33912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853619.33960: Set connection var ansible_shell_type to sh 28023 1726853619.33963: Set connection var ansible_shell_executable to /bin/sh 28023 1726853619.33965: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853619.33972: Set connection var ansible_connection to ssh 28023 1726853619.33977: Set connection var ansible_pipelining to False 28023 1726853619.33987: Set connection var ansible_timeout to 10 28023 1726853619.34001: variable 'ansible_shell_executable' from source: unknown 28023 1726853619.34004: variable 'ansible_connection' from source: unknown 28023 1726853619.34007: variable 'ansible_module_compression' from source: unknown 28023 1726853619.34009: variable 'ansible_shell_type' from source: unknown 28023 1726853619.34011: variable 'ansible_shell_executable' from source: unknown 28023 1726853619.34013: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853619.34017: variable 'ansible_pipelining' from source: unknown 28023 1726853619.34020: variable 'ansible_timeout' from source: unknown 28023 1726853619.34024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853619.34087: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853619.34102: variable 'omit' from source: magic vars 28023 1726853619.34105: starting attempt loop 28023 1726853619.34107: running the handler 28023 1726853619.34110: _low_level_execute_command(): starting 28023 1726853619.34112: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853619.34539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853619.34561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853619.34564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.34578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.34630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.34634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.34636: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 28023 1726853619.34703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.36366: stdout chunk (state=3): >>>/root <<< 28023 1726853619.36466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.36496: stderr chunk (state=3): >>><<< 28023 1726853619.36499: stdout chunk (state=3): >>><<< 28023 1726853619.36513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.36520: _low_level_execute_command(): starting 28023 1726853619.36525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858 `" && echo 
ansible-tmp-1726853619.3651226-28594-245080571550858="` echo /root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858 `" ) && sleep 0' 28023 1726853619.36931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.36972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853619.36977: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853619.36979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.36981: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.36983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853619.36985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.37025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.37032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.37035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.37093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.39026: stdout chunk (state=3): 
>>>ansible-tmp-1726853619.3651226-28594-245080571550858=/root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858 <<< 28023 1726853619.39132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.39159: stderr chunk (state=3): >>><<< 28023 1726853619.39162: stdout chunk (state=3): >>><<< 28023 1726853619.39175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853619.3651226-28594-245080571550858=/root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.39194: variable 'ansible_module_compression' from source: unknown 28023 1726853619.39222: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853619.39239: variable 
'ansible_facts' from source: unknown 28023 1726853619.39286: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858/AnsiballZ_command.py 28023 1726853619.39375: Sending initial data 28023 1726853619.39378: Sent initial data (156 bytes) 28023 1726853619.39815: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.39818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853619.39821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.39823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853619.39825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.39879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.39884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.39894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.39944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.41523: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28023 1726853619.41532: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853619.41583: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853619.41641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmps3jmh0d1 /root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858/AnsiballZ_command.py <<< 28023 1726853619.41649: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858/AnsiballZ_command.py" <<< 28023 1726853619.41699: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmps3jmh0d1" to remote "/root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858/AnsiballZ_command.py" <<< 28023 1726853619.41702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858/AnsiballZ_command.py" <<< 28023 1726853619.42300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.42340: stderr chunk (state=3): >>><<< 28023 1726853619.42343: stdout chunk (state=3): >>><<< 28023 1726853619.42381: done 
transferring module to remote 28023 1726853619.42387: _low_level_execute_command(): starting 28023 1726853619.42392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858/ /root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858/AnsiballZ_command.py && sleep 0' 28023 1726853619.42836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853619.42839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.42842: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.42844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853619.42845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.42889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.42892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.42955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.44784: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 28023 1726853619.44804: stderr chunk (state=3): >>><<< 28023 1726853619.44808: stdout chunk (state=3): >>><<< 28023 1726853619.44820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.44823: _low_level_execute_command(): starting 28023 1726853619.44827: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858/AnsiballZ_command.py && sleep 0' 28023 1726853619.45273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.45276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 
1726853619.45283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.45285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853619.45287: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853619.45289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.45332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.45336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.45404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.61369: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-20 13:33:39.608045", "end": "2024-09-20 13:33:39.611822", "delta": "0:00:00.003777", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853619.63005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853619.63059: stderr chunk (state=3): >>><<< 28023 1726853619.63062: stdout chunk (state=3): >>><<< 28023 1726853619.63065: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-20 13:33:39.608045", "end": "2024-09-20 13:33:39.611822", "delta": "0:00:00.003777", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
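The module result above is a single JSON object that AnsiballZ_command.py prints on stdout, and the reported "delta" is simply "end" minus "start". A minimal Python sketch of decoding that payload and checking the delta, with the literal values copied verbatim from the log (trimmed to the top-level fields):

```python
import json
from datetime import datetime

# Result payload copied from the stdout chunk above (invocation details omitted).
payload = json.loads(
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0,'
    ' "cmd": ["ip", "link", "set", "ethtest1", "up"],'
    ' "start": "2024-09-20 13:33:39.608045",'
    ' "end": "2024-09-20 13:33:39.611822", "delta": "0:00:00.003777"}'
)

fmt = "%Y-%m-%d %H:%M:%S.%f"
elapsed = (datetime.strptime(payload["end"], fmt)
           - datetime.strptime(payload["start"], fmt))

print(payload["rc"], " ".join(payload["cmd"]))  # 0 ip link set ethtest1 up
print(str(elapsed) == payload["delta"])         # True
```

This is the same parse the controller performs on `rc=0, stdout=...` before the result is dumped to json and reported as `ok: [managed_node3]` further down.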
28023 1726853619.63182: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853619.63186: _low_level_execute_command(): starting 28023 1726853619.63188: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853619.3651226-28594-245080571550858/ > /dev/null 2>&1 && sleep 0' 28023 1726853619.63710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853619.63713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.63730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853619.63802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853619.63805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.63842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.63854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.63885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.63962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.65815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.65841: stderr chunk (state=3): >>><<< 28023 1726853619.65844: stdout chunk (state=3): >>><<< 28023 1726853619.65863: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.65868: handler run complete 28023 1726853619.65885: Evaluated conditional (False): False 28023 1726853619.65903: attempt loop complete, returning result 28023 1726853619.65923: variable 'item' from source: unknown 28023 1726853619.66039: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set ethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest1", "up" ], "delta": "0:00:00.003777", "end": "2024-09-20 13:33:39.611822", "item": "ip link set ethtest1 up", "rc": 0, "start": "2024-09-20 13:33:39.608045" } 28023 1726853619.66137: dumping result to json 28023 1726853619.66139: done dumping result, returning 28023 1726853619.66141: done running TaskExecutor() for managed_node3/TASK: Create veth interface ethtest1 [02083763-bbaf-fdb6-dad7-000000000300] 28023 1726853619.66143: sending task result for task 02083763-bbaf-fdb6-dad7-000000000300 28023 1726853619.66475: no more pending results, returning what we have 28023 1726853619.66479: results queue empty 28023 1726853619.66487: checking for any_errors_fatal 28023 1726853619.66493: done checking for any_errors_fatal 28023 1726853619.66494: checking for max_fail_percentage 28023 1726853619.66496: done checking for max_fail_percentage 28023 1726853619.66496: checking to see if all hosts have failed and the running result is not ok 28023 1726853619.66497: done checking to see if all hosts have failed 28023 1726853619.66498: getting the remaining hosts for this loop 28023 1726853619.66499: done getting the remaining hosts for this loop 28023 1726853619.66502: getting the next task for host managed_node3 28023 1726853619.66508: done getting next task for host managed_node3 28023 1726853619.66510: ^ task is: TASK: Set up veth as managed by NetworkManager 28023 1726853619.66513: ^ state is: HOST 
STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853619.66517: getting variables 28023 1726853619.66518: in VariableManager get_vars() 28023 1726853619.66557: Calling all_inventory to load vars for managed_node3 28023 1726853619.66559: Calling groups_inventory to load vars for managed_node3 28023 1726853619.66562: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853619.66575: Calling all_plugins_play to load vars for managed_node3 28023 1726853619.66578: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853619.66583: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000300 28023 1726853619.66585: WORKER PROCESS EXITING 28023 1726853619.66597: Calling groups_plugins_play to load vars for managed_node3 28023 1726853619.66793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853619.67059: done with get_vars() 28023 1726853619.67070: done getting variables 28023 1726853619.67128: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:33:39 -0400 (0:00:01.076) 0:00:11.755 ****** 28023 1726853619.67163: entering _queue_task() for managed_node3/command 28023 1726853619.67458: worker is 1 (out of 1 available) 28023 1726853619.67482: exiting _queue_task() for managed_node3/command 28023 1726853619.67496: done queuing things up, now waiting for results queue to drain 28023 1726853619.67497: waiting for pending results... 28023 1726853619.67724: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 28023 1726853619.67837: in run() - task 02083763-bbaf-fdb6-dad7-000000000301 28023 1726853619.67860: variable 'ansible_search_path' from source: unknown 28023 1726853619.67873: variable 'ansible_search_path' from source: unknown 28023 1726853619.67915: calling self._execute() 28023 1726853619.68008: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853619.68077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853619.68081: variable 'omit' from source: magic vars 28023 1726853619.68408: variable 'ansible_distribution_major_version' from source: facts 28023 1726853619.68427: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853619.68593: variable 'type' from source: set_fact 28023 1726853619.68605: variable 'state' from source: include params 28023 1726853619.68617: Evaluated conditional (type == 'veth' and state == 'present'): True 28023 1726853619.68629: variable 'omit' from source: magic vars 28023 1726853619.68677: variable 'omit' from source: magic vars 28023 1726853619.68782: variable 'interface' from source: set_fact 28023 1726853619.68876: variable 'omit' from source: magic vars 28023 1726853619.68880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853619.68894: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853619.68919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853619.68943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853619.68991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853619.69049: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853619.69068: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853619.69103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853619.69677: Set connection var ansible_shell_type to sh 28023 1726853619.69680: Set connection var ansible_shell_executable to /bin/sh 28023 1726853619.69683: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853619.69685: Set connection var ansible_connection to ssh 28023 1726853619.69687: Set connection var ansible_pipelining to False 28023 1726853619.69689: Set connection var ansible_timeout to 10 28023 1726853619.69692: variable 'ansible_shell_executable' from source: unknown 28023 1726853619.69695: variable 'ansible_connection' from source: unknown 28023 1726853619.69697: variable 'ansible_module_compression' from source: unknown 28023 1726853619.69700: variable 'ansible_shell_type' from source: unknown 28023 1726853619.69703: variable 'ansible_shell_executable' from source: unknown 28023 1726853619.69705: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853619.69708: variable 'ansible_pipelining' from source: unknown 28023 1726853619.69711: variable 'ansible_timeout' from source: unknown 28023 1726853619.69714: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 28023 1726853619.69789: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853619.70077: variable 'omit' from source: magic vars 28023 1726853619.70080: starting attempt loop 28023 1726853619.70083: running the handler 28023 1726853619.70085: _low_level_execute_command(): starting 28023 1726853619.70087: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853619.71149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853619.71167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.71278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.71299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.71318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 28023 1726853619.71416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.73118: stdout chunk (state=3): >>>/root <<< 28023 1726853619.73216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.73240: stderr chunk (state=3): >>><<< 28023 1726853619.73244: stdout chunk (state=3): >>><<< 28023 1726853619.73267: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.73280: _low_level_execute_command(): starting 28023 1726853619.73287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217 `" && echo ansible-tmp-1726853619.7326655-28641-190593592823217="` echo 
/root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217 `" ) && sleep 0' 28023 1726853619.73724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.73727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853619.73740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.73742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853619.73745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.73816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.73859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.75839: stdout chunk (state=3): >>>ansible-tmp-1726853619.7326655-28641-190593592823217=/root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217 <<< 28023 1726853619.75948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.75978: stderr chunk (state=3): >>><<< 28023 1726853619.75981: stdout chunk (state=3): >>><<< 28023 
1726853619.75995: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853619.7326655-28641-190593592823217=/root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.76022: variable 'ansible_module_compression' from source: unknown 28023 1726853619.76061: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853619.76096: variable 'ansible_facts' from source: unknown 28023 1726853619.76150: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217/AnsiballZ_command.py 28023 1726853619.76252: Sending initial data 28023 1726853619.76256: Sent initial data (156 bytes) 28023 1726853619.76685: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.76688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853619.76690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.76693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.76695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.76753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853619.76755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.76810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.78394: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28023 1726853619.78397: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853619.78449: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853619.78507: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpb5p8bu0j /root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217/AnsiballZ_command.py <<< 28023 1726853619.78510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217/AnsiballZ_command.py" <<< 28023 1726853619.78561: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpb5p8bu0j" to remote "/root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217/AnsiballZ_command.py" <<< 28023 1726853619.78569: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217/AnsiballZ_command.py" <<< 28023 1726853619.79160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.79197: stderr chunk (state=3): >>><<< 28023 1726853619.79200: stdout chunk (state=3): >>><<< 28023 1726853619.79240: done transferring module to remote 28023 1726853619.79248: _low_level_execute_command(): starting 28023 1726853619.79252: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217/ /root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217/AnsiballZ_command.py && sleep 0' 28023 1726853619.79677: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.79681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853619.79702: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28023 1726853619.79706: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.79746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.79749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.79816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.81638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853619.81664: stderr chunk (state=3): >>><<< 28023 1726853619.81667: stdout chunk (state=3): >>><<< 28023 1726853619.81682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853619.81685: _low_level_execute_command(): starting 28023 1726853619.81690: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217/AnsiballZ_command.py && sleep 0' 28023 1726853619.82113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853619.82116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853619.82118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.82120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853619.82122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853619.82176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853619.82183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853619.82244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853619.99777: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-20 13:33:39.976129", "end": "2024-09-20 13:33:39.995995", "delta": "0:00:00.019866", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853620.01497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853620.01524: stderr chunk (state=3): >>><<< 28023 1726853620.01527: stdout chunk (state=3): >>><<< 28023 1726853620.01549: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-20 13:33:39.976129", "end": "2024-09-20 13:33:39.995995", "delta": "0:00:00.019866", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853620.01580: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest1 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853620.01584: _low_level_execute_command(): starting 28023 1726853620.01589: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853619.7326655-28641-190593592823217/ > /dev/null 2>&1 && sleep 0' 28023 1726853620.02040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853620.02043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853620.02046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.02048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853620.02055: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.02112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853620.02117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853620.02119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853620.02179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853620.04047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853620.04079: stderr chunk (state=3): >>><<< 28023 1726853620.04082: stdout chunk (state=3): >>><<< 28023 1726853620.04095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853620.04101: handler run complete 28023 1726853620.04120: Evaluated conditional (False): False 28023 1726853620.04128: attempt loop complete, returning result 28023 1726853620.04131: _execute() done 28023 1726853620.04133: dumping result to json 28023 1726853620.04139: done dumping result, returning 28023 1726853620.04146: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-fdb6-dad7-000000000301] 28023 1726853620.04150: sending task result for task 02083763-bbaf-fdb6-dad7-000000000301 28023 1726853620.04249: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000301 28023 1726853620.04252: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest1", "managed", "true" ], "delta": "0:00:00.019866", "end": "2024-09-20 13:33:39.995995", "rc": 0, "start": "2024-09-20 13:33:39.976129" } 28023 1726853620.04325: no more pending results, returning what we have 28023 1726853620.04328: results queue empty 28023 1726853620.04329: checking for any_errors_fatal 28023 1726853620.04341: done checking for any_errors_fatal 28023 1726853620.04341: checking for max_fail_percentage 28023 1726853620.04343: done checking for max_fail_percentage 28023 1726853620.04344: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.04345: done checking to see if all hosts have failed 28023 1726853620.04345: getting the remaining hosts for this loop 28023 1726853620.04347: done getting the remaining hosts for this loop 28023 1726853620.04350: getting the next task for host managed_node3 28023 1726853620.04359: done getting next task for host managed_node3 28023 1726853620.04364: ^ task is: TASK: Delete veth interface {{ interface }} 28023 1726853620.04367: ^ state is: HOST 
STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853620.04370: getting variables 28023 1726853620.04373: in VariableManager get_vars() 28023 1726853620.04410: Calling all_inventory to load vars for managed_node3 28023 1726853620.04413: Calling groups_inventory to load vars for managed_node3 28023 1726853620.04415: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.04425: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.04427: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.04429: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.04569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.04706: done with get_vars() 28023 1726853620.04715: done getting variables 28023 1726853620.04758: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853620.04844: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest1] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 
September 2024 13:33:40 -0400 (0:00:00.377) 0:00:12.132 ****** 28023 1726853620.04868: entering _queue_task() for managed_node3/command 28023 1726853620.05086: worker is 1 (out of 1 available) 28023 1726853620.05100: exiting _queue_task() for managed_node3/command 28023 1726853620.05114: done queuing things up, now waiting for results queue to drain 28023 1726853620.05117: waiting for pending results... 28023 1726853620.05279: running TaskExecutor() for managed_node3/TASK: Delete veth interface ethtest1 28023 1726853620.05342: in run() - task 02083763-bbaf-fdb6-dad7-000000000302 28023 1726853620.05355: variable 'ansible_search_path' from source: unknown 28023 1726853620.05362: variable 'ansible_search_path' from source: unknown 28023 1726853620.05388: calling self._execute() 28023 1726853620.05463: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.05470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.05474: variable 'omit' from source: magic vars 28023 1726853620.05721: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.05731: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.05863: variable 'type' from source: set_fact 28023 1726853620.05866: variable 'state' from source: include params 28023 1726853620.05869: variable 'interface' from source: set_fact 28023 1726853620.05874: variable 'current_interfaces' from source: set_fact 28023 1726853620.05883: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 28023 1726853620.05886: when evaluation is False, skipping this task 28023 1726853620.05890: _execute() done 28023 1726853620.05893: dumping result to json 28023 1726853620.05897: done dumping result, returning 28023 1726853620.05907: done running TaskExecutor() for managed_node3/TASK: Delete veth interface ethtest1 [02083763-bbaf-fdb6-dad7-000000000302] 28023 
1726853620.05910: sending task result for task 02083763-bbaf-fdb6-dad7-000000000302 28023 1726853620.05988: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000302 28023 1726853620.05991: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 28023 1726853620.06054: no more pending results, returning what we have 28023 1726853620.06060: results queue empty 28023 1726853620.06061: checking for any_errors_fatal 28023 1726853620.06069: done checking for any_errors_fatal 28023 1726853620.06070: checking for max_fail_percentage 28023 1726853620.06076: done checking for max_fail_percentage 28023 1726853620.06077: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.06078: done checking to see if all hosts have failed 28023 1726853620.06079: getting the remaining hosts for this loop 28023 1726853620.06080: done getting the remaining hosts for this loop 28023 1726853620.06083: getting the next task for host managed_node3 28023 1726853620.06089: done getting next task for host managed_node3 28023 1726853620.06091: ^ task is: TASK: Create dummy interface {{ interface }} 28023 1726853620.06094: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.06098: getting variables 28023 1726853620.06099: in VariableManager get_vars() 28023 1726853620.06134: Calling all_inventory to load vars for managed_node3 28023 1726853620.06136: Calling groups_inventory to load vars for managed_node3 28023 1726853620.06138: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.06147: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.06149: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.06152: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.06279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.06442: done with get_vars() 28023 1726853620.06449: done getting variables 28023 1726853620.06493: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853620.06570: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest1] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:33:40 -0400 (0:00:00.017) 0:00:12.149 ****** 28023 1726853620.06594: entering _queue_task() for managed_node3/command 28023 1726853620.06811: worker is 1 (out of 1 available) 28023 1726853620.06824: exiting _queue_task() for managed_node3/command 28023 1726853620.06838: done queuing things up, now waiting for results queue to drain 28023 1726853620.06839: waiting for pending results... 
28023 1726853620.07036: running TaskExecutor() for managed_node3/TASK: Create dummy interface ethtest1 28023 1726853620.07108: in run() - task 02083763-bbaf-fdb6-dad7-000000000303 28023 1726853620.07119: variable 'ansible_search_path' from source: unknown 28023 1726853620.07123: variable 'ansible_search_path' from source: unknown 28023 1726853620.07150: calling self._execute() 28023 1726853620.07220: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.07224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.07232: variable 'omit' from source: magic vars 28023 1726853620.07490: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.07500: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.07630: variable 'type' from source: set_fact 28023 1726853620.07634: variable 'state' from source: include params 28023 1726853620.07638: variable 'interface' from source: set_fact 28023 1726853620.07643: variable 'current_interfaces' from source: set_fact 28023 1726853620.07651: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 28023 1726853620.07655: when evaluation is False, skipping this task 28023 1726853620.07657: _execute() done 28023 1726853620.07663: dumping result to json 28023 1726853620.07665: done dumping result, returning 28023 1726853620.07674: done running TaskExecutor() for managed_node3/TASK: Create dummy interface ethtest1 [02083763-bbaf-fdb6-dad7-000000000303] 28023 1726853620.07677: sending task result for task 02083763-bbaf-fdb6-dad7-000000000303 28023 1726853620.07753: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000303 28023 1726853620.07756: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 28023 1726853620.07807: no more pending results, returning what we have 28023 1726853620.07810: results queue empty 28023 1726853620.07811: checking for any_errors_fatal 28023 1726853620.07816: done checking for any_errors_fatal 28023 1726853620.07817: checking for max_fail_percentage 28023 1726853620.07818: done checking for max_fail_percentage 28023 1726853620.07819: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.07820: done checking to see if all hosts have failed 28023 1726853620.07821: getting the remaining hosts for this loop 28023 1726853620.07822: done getting the remaining hosts for this loop 28023 1726853620.07826: getting the next task for host managed_node3 28023 1726853620.07831: done getting next task for host managed_node3 28023 1726853620.07835: ^ task is: TASK: Delete dummy interface {{ interface }} 28023 1726853620.07838: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.07841: getting variables 28023 1726853620.07843: in VariableManager get_vars() 28023 1726853620.07880: Calling all_inventory to load vars for managed_node3 28023 1726853620.07883: Calling groups_inventory to load vars for managed_node3 28023 1726853620.07885: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.07894: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.07896: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.07898: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.08021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.08152: done with get_vars() 28023 1726853620.08159: done getting variables 28023 1726853620.08200: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853620.08279: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest1] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:33:40 -0400 (0:00:00.017) 0:00:12.166 ****** 28023 1726853620.08299: entering _queue_task() for managed_node3/command 28023 1726853620.08484: worker is 1 (out of 1 available) 28023 1726853620.08496: exiting _queue_task() for managed_node3/command 28023 1726853620.08510: done queuing things up, now waiting for results queue to drain 28023 1726853620.08511: waiting for pending results... 
28023 1726853620.08788: running TaskExecutor() for managed_node3/TASK: Delete dummy interface ethtest1 28023 1726853620.08801: in run() - task 02083763-bbaf-fdb6-dad7-000000000304 28023 1726853620.08819: variable 'ansible_search_path' from source: unknown 28023 1726853620.08827: variable 'ansible_search_path' from source: unknown 28023 1726853620.08867: calling self._execute() 28023 1726853620.08954: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.08977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.09076: variable 'omit' from source: magic vars 28023 1726853620.09332: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.09349: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.09548: variable 'type' from source: set_fact 28023 1726853620.09565: variable 'state' from source: include params 28023 1726853620.09576: variable 'interface' from source: set_fact 28023 1726853620.09585: variable 'current_interfaces' from source: set_fact 28023 1726853620.09597: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 28023 1726853620.09605: when evaluation is False, skipping this task 28023 1726853620.09611: _execute() done 28023 1726853620.09665: dumping result to json 28023 1726853620.09668: done dumping result, returning 28023 1726853620.09670: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface ethtest1 [02083763-bbaf-fdb6-dad7-000000000304] 28023 1726853620.09676: sending task result for task 02083763-bbaf-fdb6-dad7-000000000304 28023 1726853620.09737: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000304 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 28023 1726853620.09816: no more pending 
results, returning what we have 28023 1726853620.09820: results queue empty 28023 1726853620.09821: checking for any_errors_fatal 28023 1726853620.09828: done checking for any_errors_fatal 28023 1726853620.09829: checking for max_fail_percentage 28023 1726853620.09830: done checking for max_fail_percentage 28023 1726853620.09831: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.09832: done checking to see if all hosts have failed 28023 1726853620.09834: getting the remaining hosts for this loop 28023 1726853620.09835: done getting the remaining hosts for this loop 28023 1726853620.09840: getting the next task for host managed_node3 28023 1726853620.09847: done getting next task for host managed_node3 28023 1726853620.09858: ^ task is: TASK: Create tap interface {{ interface }} 28023 1726853620.09863: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.09868: getting variables 28023 1726853620.09872: in VariableManager get_vars() 28023 1726853620.09915: Calling all_inventory to load vars for managed_node3 28023 1726853620.09918: Calling groups_inventory to load vars for managed_node3 28023 1726853620.09920: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.09933: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.09935: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.09938: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.10150: WORKER PROCESS EXITING 28023 1726853620.10163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.10290: done with get_vars() 28023 1726853620.10297: done getting variables 28023 1726853620.10337: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853620.10412: variable 'interface' from source: set_fact TASK [Create tap interface ethtest1] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:33:40 -0400 (0:00:00.021) 0:00:12.188 ****** 28023 1726853620.10434: entering _queue_task() for managed_node3/command 28023 1726853620.10628: worker is 1 (out of 1 available) 28023 1726853620.10641: exiting _queue_task() for managed_node3/command 28023 1726853620.10655: done queuing things up, now waiting for results queue to drain 28023 1726853620.10659: waiting for pending results... 
28023 1726853620.10809: running TaskExecutor() for managed_node3/TASK: Create tap interface ethtest1 28023 1726853620.10874: in run() - task 02083763-bbaf-fdb6-dad7-000000000305 28023 1726853620.10891: variable 'ansible_search_path' from source: unknown 28023 1726853620.10895: variable 'ansible_search_path' from source: unknown 28023 1726853620.10918: calling self._execute() 28023 1726853620.10985: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.10990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.11001: variable 'omit' from source: magic vars 28023 1726853620.11246: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.11255: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.11383: variable 'type' from source: set_fact 28023 1726853620.11387: variable 'state' from source: include params 28023 1726853620.11390: variable 'interface' from source: set_fact 28023 1726853620.11396: variable 'current_interfaces' from source: set_fact 28023 1726853620.11404: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 28023 1726853620.11407: when evaluation is False, skipping this task 28023 1726853620.11409: _execute() done 28023 1726853620.11411: dumping result to json 28023 1726853620.11415: done dumping result, returning 28023 1726853620.11422: done running TaskExecutor() for managed_node3/TASK: Create tap interface ethtest1 [02083763-bbaf-fdb6-dad7-000000000305] 28023 1726853620.11428: sending task result for task 02083763-bbaf-fdb6-dad7-000000000305 28023 1726853620.11509: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000305 28023 1726853620.11512: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 28023 1726853620.11586: no more pending results, returning what we have 28023 1726853620.11589: results queue empty 28023 1726853620.11590: checking for any_errors_fatal 28023 1726853620.11596: done checking for any_errors_fatal 28023 1726853620.11596: checking for max_fail_percentage 28023 1726853620.11598: done checking for max_fail_percentage 28023 1726853620.11599: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.11600: done checking to see if all hosts have failed 28023 1726853620.11601: getting the remaining hosts for this loop 28023 1726853620.11602: done getting the remaining hosts for this loop 28023 1726853620.11604: getting the next task for host managed_node3 28023 1726853620.11609: done getting next task for host managed_node3 28023 1726853620.11611: ^ task is: TASK: Delete tap interface {{ interface }} 28023 1726853620.11614: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.11617: getting variables 28023 1726853620.11618: in VariableManager get_vars() 28023 1726853620.11664: Calling all_inventory to load vars for managed_node3 28023 1726853620.11667: Calling groups_inventory to load vars for managed_node3 28023 1726853620.11670: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.11683: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.11686: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.11689: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.11887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.12118: done with get_vars() 28023 1726853620.12128: done getting variables 28023 1726853620.12187: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853620.12299: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest1] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:33:40 -0400 (0:00:00.018) 0:00:12.207 ****** 28023 1726853620.12333: entering _queue_task() for managed_node3/command 28023 1726853620.12682: worker is 1 (out of 1 available) 28023 1726853620.12693: exiting _queue_task() for managed_node3/command 28023 1726853620.12705: done queuing things up, now waiting for results queue to drain 28023 1726853620.12706: waiting for pending results... 
28023 1726853620.12987: running TaskExecutor() for managed_node3/TASK: Delete tap interface ethtest1 28023 1726853620.13006: in run() - task 02083763-bbaf-fdb6-dad7-000000000306 28023 1726853620.13024: variable 'ansible_search_path' from source: unknown 28023 1726853620.13031: variable 'ansible_search_path' from source: unknown 28023 1726853620.13075: calling self._execute() 28023 1726853620.13170: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.13188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.13201: variable 'omit' from source: magic vars 28023 1726853620.13584: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.13621: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.13820: variable 'type' from source: set_fact 28023 1726853620.13830: variable 'state' from source: include params 28023 1726853620.13862: variable 'interface' from source: set_fact 28023 1726853620.13865: variable 'current_interfaces' from source: set_fact 28023 1726853620.13869: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 28023 1726853620.13878: when evaluation is False, skipping this task 28023 1726853620.13884: _execute() done 28023 1726853620.13892: dumping result to json 28023 1726853620.13948: done dumping result, returning 28023 1726853620.13952: done running TaskExecutor() for managed_node3/TASK: Delete tap interface ethtest1 [02083763-bbaf-fdb6-dad7-000000000306] 28023 1726853620.13954: sending task result for task 02083763-bbaf-fdb6-dad7-000000000306 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 28023 1726853620.14108: no more pending results, returning what we have 28023 1726853620.14111: results queue empty 28023 1726853620.14112: 
checking for any_errors_fatal 28023 1726853620.14118: done checking for any_errors_fatal 28023 1726853620.14118: checking for max_fail_percentage 28023 1726853620.14120: done checking for max_fail_percentage 28023 1726853620.14121: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.14122: done checking to see if all hosts have failed 28023 1726853620.14123: getting the remaining hosts for this loop 28023 1726853620.14125: done getting the remaining hosts for this loop 28023 1726853620.14128: getting the next task for host managed_node3 28023 1726853620.14142: done getting next task for host managed_node3 28023 1726853620.14146: ^ task is: TASK: Assert device is present 28023 1726853620.14149: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.14154: getting variables 28023 1726853620.14155: in VariableManager get_vars() 28023 1726853620.14314: Calling all_inventory to load vars for managed_node3 28023 1726853620.14317: Calling groups_inventory to load vars for managed_node3 28023 1726853620.14319: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.14388: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.14391: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.14395: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.14694: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000306 28023 1726853620.14697: WORKER PROCESS EXITING 28023 1726853620.14723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.14954: done with get_vars() 28023 1726853620.14966: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:32 Friday 20 September 2024 13:33:40 -0400 (0:00:00.027) 0:00:12.234 ****** 28023 1726853620.15055: entering _queue_task() for managed_node3/include_tasks 28023 1726853620.15310: worker is 1 (out of 1 available) 28023 1726853620.15322: exiting _queue_task() for managed_node3/include_tasks 28023 1726853620.15335: done queuing things up, now waiting for results queue to drain 28023 1726853620.15336: waiting for pending results... 
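The skip recorded just above shows Ansible's conditional short-circuit: the executor resolved `type`, `state`, and `interface` from `set_fact`, evaluated the `when:` expression to False, and returned a skip result without ever invoking a module. A hedged reconstruction of what such a task might look like — only the condition string is taken verbatim from the `false_condition` field in the log; the module choice and task body are assumptions:

```yaml
# Hypothetical sketch -- the when: expression is verbatim from the log's
# false_condition; the command used to delete the tap device is an assumption.
- name: Delete tap interface ethtest1
  command: ip tuntap del dev {{ interface }} mode tap
  when: type == 'tap' and state == 'absent' and interface in current_interfaces
```

Because the conditional is False, the result carries `"changed": false` and `"skip_reason": "Conditional result was False"`, exactly as shown above.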
28023 1726853620.15613: running TaskExecutor() for managed_node3/TASK: Assert device is present 28023 1726853620.15713: in run() - task 02083763-bbaf-fdb6-dad7-000000000012 28023 1726853620.15735: variable 'ansible_search_path' from source: unknown 28023 1726853620.15796: calling self._execute() 28023 1726853620.15894: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.15976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.15979: variable 'omit' from source: magic vars 28023 1726853620.16304: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.16322: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.16340: _execute() done 28023 1726853620.16352: dumping result to json 28023 1726853620.16363: done dumping result, returning 28023 1726853620.16376: done running TaskExecutor() for managed_node3/TASK: Assert device is present [02083763-bbaf-fdb6-dad7-000000000012] 28023 1726853620.16386: sending task result for task 02083763-bbaf-fdb6-dad7-000000000012 28023 1726853620.16609: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000012 28023 1726853620.16611: WORKER PROCESS EXITING 28023 1726853620.16640: no more pending results, returning what we have 28023 1726853620.16644: in VariableManager get_vars() 28023 1726853620.16695: Calling all_inventory to load vars for managed_node3 28023 1726853620.16698: Calling groups_inventory to load vars for managed_node3 28023 1726853620.16700: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.16715: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.16717: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.16724: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.17004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 28023 1726853620.17311: done with get_vars() 28023 1726853620.17320: variable 'ansible_search_path' from source: unknown 28023 1726853620.17334: we have included files to process 28023 1726853620.17335: generating all_blocks data 28023 1726853620.17337: done generating all_blocks data 28023 1726853620.17343: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28023 1726853620.17345: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28023 1726853620.17347: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28023 1726853620.17467: in VariableManager get_vars() 28023 1726853620.17497: done with get_vars() 28023 1726853620.17617: done processing included file 28023 1726853620.17620: iterating over new_blocks loaded from include file 28023 1726853620.17621: in VariableManager get_vars() 28023 1726853620.17639: done with get_vars() 28023 1726853620.17640: filtering new block on tags 28023 1726853620.17662: done filtering new block on tags 28023 1726853620.17665: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 28023 1726853620.17670: extending task lists for all hosts with included blocks 28023 1726853620.18992: done extending task lists 28023 1726853620.18994: done processing included files 28023 1726853620.18995: results queue empty 28023 1726853620.18996: checking for any_errors_fatal 28023 1726853620.18999: done checking for any_errors_fatal 28023 1726853620.19000: checking for max_fail_percentage 28023 1726853620.19001: done checking for max_fail_percentage 28023 1726853620.19001: checking to see if all hosts have failed and the 
running result is not ok 28023 1726853620.19002: done checking to see if all hosts have failed 28023 1726853620.19003: getting the remaining hosts for this loop 28023 1726853620.19009: done getting the remaining hosts for this loop 28023 1726853620.19012: getting the next task for host managed_node3 28023 1726853620.19016: done getting next task for host managed_node3 28023 1726853620.19018: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28023 1726853620.19021: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.19024: getting variables 28023 1726853620.19025: in VariableManager get_vars() 28023 1726853620.19040: Calling all_inventory to load vars for managed_node3 28023 1726853620.19043: Calling groups_inventory to load vars for managed_node3 28023 1726853620.19045: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.19050: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.19053: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.19056: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.19463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.19678: done with get_vars() 28023 1726853620.19687: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:33:40 -0400 (0:00:00.046) 0:00:12.281 ****** 28023 1726853620.19751: entering _queue_task() for managed_node3/include_tasks 28023 1726853620.20063: worker is 1 (out of 1 available) 28023 1726853620.20078: exiting _queue_task() for managed_node3/include_tasks 28023 1726853620.20204: done queuing things up, now waiting for results queue to drain 28023 1726853620.20206: waiting for pending results... 
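The two task headers above chain together: `tests_route_device.yml:32` includes `assert_device_present.yml`, whose line 3 in turn includes `get_interface_stat.yml` (both task paths appear verbatim in the log). A minimal sketch of that include pattern — the file contents beyond the attested include are assumptions:

```yaml
# assert_device_present.yml (sketch; only the include at line 3 is attested
# by the "task path ... assert_device_present.yml:3" entry in the log)
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
```

Each `include_tasks` triggers the "processing included file / generating all_blocks data / extending task lists" sequence visible in the surrounding log lines, which is why new task IDs appear only after the include is resolved at run time.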
28023 1726853620.20483: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 28023 1726853620.20532: in run() - task 02083763-bbaf-fdb6-dad7-0000000003eb 28023 1726853620.20553: variable 'ansible_search_path' from source: unknown 28023 1726853620.20564: variable 'ansible_search_path' from source: unknown 28023 1726853620.20612: calling self._execute() 28023 1726853620.20717: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.20749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.20752: variable 'omit' from source: magic vars 28023 1726853620.21146: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.21184: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.21188: _execute() done 28023 1726853620.21190: dumping result to json 28023 1726853620.21196: done dumping result, returning 28023 1726853620.21252: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-fdb6-dad7-0000000003eb] 28023 1726853620.21255: sending task result for task 02083763-bbaf-fdb6-dad7-0000000003eb 28023 1726853620.21402: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000003eb 28023 1726853620.21406: WORKER PROCESS EXITING 28023 1726853620.21436: no more pending results, returning what we have 28023 1726853620.21441: in VariableManager get_vars() 28023 1726853620.21496: Calling all_inventory to load vars for managed_node3 28023 1726853620.21499: Calling groups_inventory to load vars for managed_node3 28023 1726853620.21502: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.21589: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.21593: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.21596: Calling groups_plugins_play to load vars for managed_node3 28023 
1726853620.21959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.22195: done with get_vars() 28023 1726853620.22203: variable 'ansible_search_path' from source: unknown 28023 1726853620.22205: variable 'ansible_search_path' from source: unknown 28023 1726853620.22243: we have included files to process 28023 1726853620.22244: generating all_blocks data 28023 1726853620.22246: done generating all_blocks data 28023 1726853620.22247: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853620.22248: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853620.22251: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853620.22443: done processing included file 28023 1726853620.22445: iterating over new_blocks loaded from include file 28023 1726853620.22447: in VariableManager get_vars() 28023 1726853620.22469: done with get_vars() 28023 1726853620.22472: filtering new block on tags 28023 1726853620.22492: done filtering new block on tags 28023 1726853620.22494: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 28023 1726853620.22499: extending task lists for all hosts with included blocks 28023 1726853620.22610: done extending task lists 28023 1726853620.22611: done processing included files 28023 1726853620.22612: results queue empty 28023 1726853620.22613: checking for any_errors_fatal 28023 1726853620.22616: done checking for any_errors_fatal 28023 1726853620.22617: checking for max_fail_percentage 28023 1726853620.22618: done checking for 
max_fail_percentage 28023 1726853620.22619: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.22620: done checking to see if all hosts have failed 28023 1726853620.22620: getting the remaining hosts for this loop 28023 1726853620.22622: done getting the remaining hosts for this loop 28023 1726853620.22624: getting the next task for host managed_node3 28023 1726853620.22628: done getting next task for host managed_node3 28023 1726853620.22630: ^ task is: TASK: Get stat for interface {{ interface }} 28023 1726853620.22635: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.22637: getting variables 28023 1726853620.22638: in VariableManager get_vars() 28023 1726853620.22652: Calling all_inventory to load vars for managed_node3 28023 1726853620.22655: Calling groups_inventory to load vars for managed_node3 28023 1726853620.22659: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.22664: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.22667: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.22670: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.22835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.23045: done with get_vars() 28023 1726853620.23055: done getting variables 28023 1726853620.23207: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:33:40 -0400 (0:00:00.034) 0:00:12.316 ****** 28023 1726853620.23236: entering _queue_task() for managed_node3/stat 28023 1726853620.23547: worker is 1 (out of 1 available) 28023 1726853620.23560: exiting _queue_task() for managed_node3/stat 28023 1726853620.23675: done queuing things up, now waiting for results queue to drain 28023 1726853620.23678: waiting for pending results... 
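The `Get stat for interface {{ interface }}` task queued above resolves, per the `module_args` echoed later in the module result, to a `stat` call on `/sys/class/net/<interface>` with attribute, checksum, and MIME gathering disabled. A sketch consistent with those logged arguments (the `register` name is an assumption):

```yaml
# Sketch of get_interface_stat.yml; path and the three get_* flags match the
# module_args shown in the stat result, the register name is hypothetical.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```

Disabling the extra gathering keeps the check to a bare existence test, which is all the assertion needs.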
28023 1726853620.23918: running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest1 28023 1726853620.23988: in run() - task 02083763-bbaf-fdb6-dad7-000000000483 28023 1726853620.24013: variable 'ansible_search_path' from source: unknown 28023 1726853620.24024: variable 'ansible_search_path' from source: unknown 28023 1726853620.24069: calling self._execute() 28023 1726853620.24173: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.24233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.24238: variable 'omit' from source: magic vars 28023 1726853620.24594: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.24667: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.24672: variable 'omit' from source: magic vars 28023 1726853620.24675: variable 'omit' from source: magic vars 28023 1726853620.24781: variable 'interface' from source: set_fact 28023 1726853620.24808: variable 'omit' from source: magic vars 28023 1726853620.24885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853620.24906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853620.24931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853620.24951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853620.24969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853620.25011: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853620.25175: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.25179: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.25181: Set connection var ansible_shell_type to sh 28023 1726853620.25183: Set connection var ansible_shell_executable to /bin/sh 28023 1726853620.25185: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853620.25186: Set connection var ansible_connection to ssh 28023 1726853620.25188: Set connection var ansible_pipelining to False 28023 1726853620.25190: Set connection var ansible_timeout to 10 28023 1726853620.25192: variable 'ansible_shell_executable' from source: unknown 28023 1726853620.25194: variable 'ansible_connection' from source: unknown 28023 1726853620.25203: variable 'ansible_module_compression' from source: unknown 28023 1726853620.25210: variable 'ansible_shell_type' from source: unknown 28023 1726853620.25216: variable 'ansible_shell_executable' from source: unknown 28023 1726853620.25222: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.25229: variable 'ansible_pipelining' from source: unknown 28023 1726853620.25235: variable 'ansible_timeout' from source: unknown 28023 1726853620.25241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.25451: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853620.25470: variable 'omit' from source: magic vars 28023 1726853620.25482: starting attempt loop 28023 1726853620.25488: running the handler 28023 1726853620.25505: _low_level_execute_command(): starting 28023 1726853620.25516: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853620.26431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853620.26450: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 28023 1726853620.26533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.26591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853620.26610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853620.26642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853620.26750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853620.28474: stdout chunk (state=3): >>>/root <<< 28023 1726853620.28610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853620.28780: stderr chunk (state=3): >>><<< 28023 1726853620.28783: stdout chunk (state=3): >>><<< 28023 1726853620.28789: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853620.28791: _low_level_execute_command(): starting 28023 1726853620.28794: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198 `" && echo ansible-tmp-1726853620.287055-28667-169727590511198="` echo /root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198 `" ) && sleep 0' 28023 1726853620.29709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853620.29713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853620.29716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.29725: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853620.29728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.29793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853620.29801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853620.29804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853620.29880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853620.31937: stdout chunk (state=3): >>>ansible-tmp-1726853620.287055-28667-169727590511198=/root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198 <<< 28023 1726853620.32377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853620.32381: stdout chunk (state=3): >>><<< 28023 1726853620.32383: stderr chunk (state=3): >>><<< 28023 1726853620.32385: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853620.287055-28667-169727590511198=/root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853620.32388: variable 'ansible_module_compression' from source: unknown 28023 1726853620.32389: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28023 1726853620.32391: variable 'ansible_facts' from source: unknown 28023 1726853620.32565: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198/AnsiballZ_stat.py 28023 1726853620.32798: Sending initial data 28023 1726853620.32809: Sent initial data (152 bytes) 28023 1726853620.33320: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853620.33334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853620.33350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853620.33374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853620.33481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853620.33506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853620.33602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853620.35235: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853620.35293: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853620.35355: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp0cow1a8m /root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198/AnsiballZ_stat.py <<< 28023 1726853620.35362: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198/AnsiballZ_stat.py" <<< 28023 1726853620.35441: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp0cow1a8m" to remote "/root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198/AnsiballZ_stat.py" <<< 28023 1726853620.36965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853620.36968: stderr chunk (state=3): >>><<< 28023 1726853620.36976: stdout chunk (state=3): >>><<< 28023 1726853620.36980: done transferring module to remote 28023 1726853620.36982: _low_level_execute_command(): starting 28023 1726853620.36984: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198/ /root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198/AnsiballZ_stat.py && sleep 0' 28023 1726853620.38167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853620.38170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853620.38176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.38178: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853620.38180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.38317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853620.38416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853620.40287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853620.40360: stderr chunk (state=3): >>><<< 28023 1726853620.40363: stdout chunk (state=3): >>><<< 28023 1726853620.40379: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853620.40461: _low_level_execute_command(): starting 28023 1726853620.40465: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198/AnsiballZ_stat.py && sleep 0' 28023 1726853620.40952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853620.40965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853620.40980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853620.40995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853620.41011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853620.41022: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853620.41036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.41091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
28023 1726853620.41145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853620.41162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853620.41189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853620.41289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853620.57115: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31462, "dev": 23, "nlink": 1, "atime": 1726853618.8931813, "mtime": 1726853618.8931813, "ctime": 1726853618.8931813, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28023 1726853620.58323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853620.58358: stderr chunk (state=3): >>><<< 28023 1726853620.58369: stdout chunk (state=3): >>><<< 28023 1726853620.58398: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31462, "dev": 23, "nlink": 1, "atime": 1726853618.8931813, "mtime": 1726853618.8931813, "ctime": 1726853618.8931813, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853620.58459: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853620.58482: _low_level_execute_command(): starting 28023 1726853620.58493: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853620.287055-28667-169727590511198/ > /dev/null 2>&1 && sleep 0' 28023 1726853620.59100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853620.59116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853620.59134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853620.59155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853620.59177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 <<< 28023 1726853620.59189: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853620.59202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.59220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853620.59233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853620.59288: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.59335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853620.59362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853620.59387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853620.59482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853620.61348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853620.61407: stderr chunk (state=3): >>><<< 28023 1726853620.61418: stdout chunk (state=3): >>><<< 28023 1726853620.61443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853620.61455: handler run complete 28023 1726853620.61512: attempt loop complete, returning result 28023 1726853620.61520: _execute() done 28023 1726853620.61528: dumping result to json 28023 1726853620.61538: done dumping result, returning 28023 1726853620.61551: done running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest1 [02083763-bbaf-fdb6-dad7-000000000483] 28023 1726853620.61565: sending task result for task 02083763-bbaf-fdb6-dad7-000000000483 ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853618.8931813, "block_size": 4096, "blocks": 0, "ctime": 1726853618.8931813, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31462, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "mode": "0777", "mtime": 1726853618.8931813, "nlink": 1, "path": "/sys/class/net/ethtest1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, 
"wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 28023 1726853620.61779: no more pending results, returning what we have 28023 1726853620.61782: results queue empty 28023 1726853620.61783: checking for any_errors_fatal 28023 1726853620.61785: done checking for any_errors_fatal 28023 1726853620.61785: checking for max_fail_percentage 28023 1726853620.61787: done checking for max_fail_percentage 28023 1726853620.61788: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.61789: done checking to see if all hosts have failed 28023 1726853620.61789: getting the remaining hosts for this loop 28023 1726853620.61791: done getting the remaining hosts for this loop 28023 1726853620.61794: getting the next task for host managed_node3 28023 1726853620.61807: done getting next task for host managed_node3 28023 1726853620.61809: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 28023 1726853620.61813: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.61816: getting variables 28023 1726853620.61817: in VariableManager get_vars() 28023 1726853620.61859: Calling all_inventory to load vars for managed_node3 28023 1726853620.61862: Calling groups_inventory to load vars for managed_node3 28023 1726853620.61864: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.62025: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.62029: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.62034: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.62297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.62897: done with get_vars() 28023 1726853620.62909: done getting variables 28023 1726853620.62952: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000483 28023 1726853620.62955: WORKER PROCESS EXITING 28023 1726853620.62984: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853620.63214: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest1'] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:33:40 -0400 (0:00:00.400) 0:00:12.716 ****** 28023 1726853620.63251: entering _queue_task() for managed_node3/assert 28023 1726853620.63680: worker is 1 (out of 1 available) 28023 1726853620.63690: exiting _queue_task() for managed_node3/assert 28023 1726853620.63702: done queuing things up, now waiting for results queue to drain 28023 1726853620.63703: waiting for pending results... 
28023 1726853620.64042: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'ethtest1' 28023 1726853620.64166: in run() - task 02083763-bbaf-fdb6-dad7-0000000003ec 28023 1726853620.64189: variable 'ansible_search_path' from source: unknown 28023 1726853620.64197: variable 'ansible_search_path' from source: unknown 28023 1726853620.64243: calling self._execute() 28023 1726853620.64345: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.64364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.64381: variable 'omit' from source: magic vars 28023 1726853620.64759: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.64779: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.64794: variable 'omit' from source: magic vars 28023 1726853620.64833: variable 'omit' from source: magic vars 28023 1726853620.64937: variable 'interface' from source: set_fact 28023 1726853620.64962: variable 'omit' from source: magic vars 28023 1726853620.65008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853620.65050: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853620.65079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853620.65103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853620.65124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853620.65189: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853620.65198: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.65206: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.65308: Set connection var ansible_shell_type to sh 28023 1726853620.65320: Set connection var ansible_shell_executable to /bin/sh 28023 1726853620.65352: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853620.65367: Set connection var ansible_connection to ssh 28023 1726853620.65383: Set connection var ansible_pipelining to False 28023 1726853620.65393: Set connection var ansible_timeout to 10 28023 1726853620.65449: variable 'ansible_shell_executable' from source: unknown 28023 1726853620.65453: variable 'ansible_connection' from source: unknown 28023 1726853620.65455: variable 'ansible_module_compression' from source: unknown 28023 1726853620.65460: variable 'ansible_shell_type' from source: unknown 28023 1726853620.65491: variable 'ansible_shell_executable' from source: unknown 28023 1726853620.65494: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.65496: variable 'ansible_pipelining' from source: unknown 28023 1726853620.65498: variable 'ansible_timeout' from source: unknown 28023 1726853620.65500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.65777: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853620.65780: variable 'omit' from source: magic vars 28023 1726853620.65782: starting attempt loop 28023 1726853620.65785: running the handler 28023 1726853620.65847: variable 'interface_stat' from source: set_fact 28023 1726853620.65880: Evaluated conditional (interface_stat.stat.exists): True 28023 1726853620.65894: handler run complete 28023 1726853620.65913: attempt loop complete, returning result 28023 
1726853620.65920: _execute() done 28023 1726853620.65928: dumping result to json 28023 1726853620.65934: done dumping result, returning 28023 1726853620.65945: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'ethtest1' [02083763-bbaf-fdb6-dad7-0000000003ec] 28023 1726853620.65955: sending task result for task 02083763-bbaf-fdb6-dad7-0000000003ec 28023 1726853620.66176: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000003ec 28023 1726853620.66179: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 28023 1726853620.66228: no more pending results, returning what we have 28023 1726853620.66231: results queue empty 28023 1726853620.66232: checking for any_errors_fatal 28023 1726853620.66245: done checking for any_errors_fatal 28023 1726853620.66245: checking for max_fail_percentage 28023 1726853620.66247: done checking for max_fail_percentage 28023 1726853620.66248: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.66249: done checking to see if all hosts have failed 28023 1726853620.66250: getting the remaining hosts for this loop 28023 1726853620.66251: done getting the remaining hosts for this loop 28023 1726853620.66255: getting the next task for host managed_node3 28023 1726853620.66267: done getting next task for host managed_node3 28023 1726853620.66276: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28023 1726853620.66280: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853620.66295: getting variables 28023 1726853620.66297: in VariableManager get_vars() 28023 1726853620.66337: Calling all_inventory to load vars for managed_node3 28023 1726853620.66340: Calling groups_inventory to load vars for managed_node3 28023 1726853620.66343: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.66353: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.66358: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.66362: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.66653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.66933: done with get_vars() 28023 1726853620.66944: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:33:40 -0400 (0:00:00.037) 0:00:12.754 ****** 28023 1726853620.67037: entering _queue_task() for managed_node3/include_tasks 28023 1726853620.67493: worker is 1 (out of 1 available) 28023 1726853620.67501: exiting _queue_task() for managed_node3/include_tasks 28023 1726853620.67512: done queuing things up, now waiting for results queue to drain 28023 1726853620.67513: waiting for pending results... 
28023 1726853620.67633: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28023 1726853620.67778: in run() - task 02083763-bbaf-fdb6-dad7-00000000001b 28023 1726853620.67805: variable 'ansible_search_path' from source: unknown 28023 1726853620.67812: variable 'ansible_search_path' from source: unknown 28023 1726853620.67855: calling self._execute() 28023 1726853620.67954: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.67970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.67986: variable 'omit' from source: magic vars 28023 1726853620.68343: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.68363: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.68376: _execute() done 28023 1726853620.68440: dumping result to json 28023 1726853620.68443: done dumping result, returning 28023 1726853620.68445: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-fdb6-dad7-00000000001b] 28023 1726853620.68447: sending task result for task 02083763-bbaf-fdb6-dad7-00000000001b 28023 1726853620.68516: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000001b 28023 1726853620.68518: WORKER PROCESS EXITING 28023 1726853620.68586: no more pending results, returning what we have 28023 1726853620.68591: in VariableManager get_vars() 28023 1726853620.68637: Calling all_inventory to load vars for managed_node3 28023 1726853620.68641: Calling groups_inventory to load vars for managed_node3 28023 1726853620.68643: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.68656: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.68662: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.68665: Calling 
groups_plugins_play to load vars for managed_node3 28023 1726853620.69143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.69403: done with get_vars() 28023 1726853620.69411: variable 'ansible_search_path' from source: unknown 28023 1726853620.69412: variable 'ansible_search_path' from source: unknown 28023 1726853620.69450: we have included files to process 28023 1726853620.69451: generating all_blocks data 28023 1726853620.69453: done generating all_blocks data 28023 1726853620.69460: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28023 1726853620.69461: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28023 1726853620.69464: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28023 1726853620.70148: done processing included file 28023 1726853620.70150: iterating over new_blocks loaded from include file 28023 1726853620.70152: in VariableManager get_vars() 28023 1726853620.70181: done with get_vars() 28023 1726853620.70183: filtering new block on tags 28023 1726853620.70200: done filtering new block on tags 28023 1726853620.70203: in VariableManager get_vars() 28023 1726853620.70226: done with get_vars() 28023 1726853620.70228: filtering new block on tags 28023 1726853620.70250: done filtering new block on tags 28023 1726853620.70253: in VariableManager get_vars() 28023 1726853620.70281: done with get_vars() 28023 1726853620.70283: filtering new block on tags 28023 1726853620.70301: done filtering new block on tags 28023 1726853620.70303: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 28023 1726853620.70308: extending task lists for 
all hosts with included blocks 28023 1726853620.71131: done extending task lists 28023 1726853620.71133: done processing included files 28023 1726853620.71134: results queue empty 28023 1726853620.71135: checking for any_errors_fatal 28023 1726853620.71137: done checking for any_errors_fatal 28023 1726853620.71138: checking for max_fail_percentage 28023 1726853620.71139: done checking for max_fail_percentage 28023 1726853620.71140: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.71141: done checking to see if all hosts have failed 28023 1726853620.71142: getting the remaining hosts for this loop 28023 1726853620.71143: done getting the remaining hosts for this loop 28023 1726853620.71145: getting the next task for host managed_node3 28023 1726853620.71149: done getting next task for host managed_node3 28023 1726853620.71151: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28023 1726853620.71154: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.71165: getting variables 28023 1726853620.71167: in VariableManager get_vars() 28023 1726853620.71183: Calling all_inventory to load vars for managed_node3 28023 1726853620.71186: Calling groups_inventory to load vars for managed_node3 28023 1726853620.71188: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.71193: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.71195: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.71198: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.71377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.71598: done with get_vars() 28023 1726853620.71607: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:33:40 -0400 (0:00:00.046) 0:00:12.800 ****** 28023 1726853620.71681: entering _queue_task() for managed_node3/setup 28023 1726853620.71940: worker is 1 (out of 1 available) 28023 1726853620.71952: exiting _queue_task() for managed_node3/setup 28023 1726853620.71967: done queuing things up, now waiting for results queue to drain 28023 1726853620.71969: waiting for pending results... 
28023 1726853620.72289: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28023 1726853620.72399: in run() - task 02083763-bbaf-fdb6-dad7-00000000049b 28023 1726853620.72419: variable 'ansible_search_path' from source: unknown 28023 1726853620.72427: variable 'ansible_search_path' from source: unknown 28023 1726853620.72472: calling self._execute() 28023 1726853620.72565: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.72822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.72837: variable 'omit' from source: magic vars 28023 1726853620.73197: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.73214: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.73463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853620.76681: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853620.76756: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853620.76840: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853620.76875: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853620.76902: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853620.77219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853620.77247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853620.77275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853620.77451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853620.77466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853620.77519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853620.77672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853620.77879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853620.77883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853620.77885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853620.78576: variable '__network_required_facts' from source: role 
'' defaults 28023 1726853620.78579: variable 'ansible_facts' from source: unknown 28023 1726853620.78580: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28023 1726853620.78582: when evaluation is False, skipping this task 28023 1726853620.78584: _execute() done 28023 1726853620.78586: dumping result to json 28023 1726853620.78587: done dumping result, returning 28023 1726853620.78589: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-fdb6-dad7-00000000049b] 28023 1726853620.78591: sending task result for task 02083763-bbaf-fdb6-dad7-00000000049b 28023 1726853620.78658: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000049b 28023 1726853620.78663: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853620.78711: no more pending results, returning what we have 28023 1726853620.78714: results queue empty 28023 1726853620.78715: checking for any_errors_fatal 28023 1726853620.78717: done checking for any_errors_fatal 28023 1726853620.78717: checking for max_fail_percentage 28023 1726853620.78719: done checking for max_fail_percentage 28023 1726853620.78720: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.78721: done checking to see if all hosts have failed 28023 1726853620.78722: getting the remaining hosts for this loop 28023 1726853620.78728: done getting the remaining hosts for this loop 28023 1726853620.78732: getting the next task for host managed_node3 28023 1726853620.78743: done getting next task for host managed_node3 28023 1726853620.78747: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28023 1726853620.78752: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853620.78769: getting variables 28023 1726853620.78773: in VariableManager get_vars() 28023 1726853620.78821: Calling all_inventory to load vars for managed_node3 28023 1726853620.78824: Calling groups_inventory to load vars for managed_node3 28023 1726853620.78826: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.78964: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.78969: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.78977: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.79415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.79750: done with get_vars() 28023 1726853620.79766: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:33:40 -0400 (0:00:00.082) 0:00:12.882 ****** 28023 1726853620.79896: entering _queue_task() for managed_node3/stat 28023 1726853620.80122: worker is 1 (out of 1 
available) 28023 1726853620.80133: exiting _queue_task() for managed_node3/stat 28023 1726853620.80144: done queuing things up, now waiting for results queue to drain 28023 1726853620.80145: waiting for pending results... 28023 1726853620.80322: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 28023 1726853620.80418: in run() - task 02083763-bbaf-fdb6-dad7-00000000049d 28023 1726853620.80430: variable 'ansible_search_path' from source: unknown 28023 1726853620.80435: variable 'ansible_search_path' from source: unknown 28023 1726853620.80464: calling self._execute() 28023 1726853620.80535: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.80538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.80546: variable 'omit' from source: magic vars 28023 1726853620.80813: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.80820: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.80935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853620.81216: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853620.81258: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853620.81317: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853620.81343: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853620.81476: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853620.81479: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853620.81485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853620.81514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853620.81613: variable '__network_is_ostree' from source: set_fact 28023 1726853620.81624: Evaluated conditional (not __network_is_ostree is defined): False 28023 1726853620.81632: when evaluation is False, skipping this task 28023 1726853620.81638: _execute() done 28023 1726853620.81644: dumping result to json 28023 1726853620.81650: done dumping result, returning 28023 1726853620.81663: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-fdb6-dad7-00000000049d] 28023 1726853620.81877: sending task result for task 02083763-bbaf-fdb6-dad7-00000000049d 28023 1726853620.81955: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000049d 28023 1726853620.81960: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28023 1726853620.82013: no more pending results, returning what we have 28023 1726853620.82017: results queue empty 28023 1726853620.82018: checking for any_errors_fatal 28023 1726853620.82031: done checking for any_errors_fatal 28023 1726853620.82032: checking for max_fail_percentage 28023 1726853620.82034: done checking for max_fail_percentage 28023 1726853620.82035: checking to see if all hosts have failed and the running result is not ok 28023 
1726853620.82036: done checking to see if all hosts have failed 28023 1726853620.82037: getting the remaining hosts for this loop 28023 1726853620.82038: done getting the remaining hosts for this loop 28023 1726853620.82042: getting the next task for host managed_node3 28023 1726853620.82049: done getting next task for host managed_node3 28023 1726853620.82053: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28023 1726853620.82059: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.82078: getting variables 28023 1726853620.82080: in VariableManager get_vars() 28023 1726853620.82123: Calling all_inventory to load vars for managed_node3 28023 1726853620.82126: Calling groups_inventory to load vars for managed_node3 28023 1726853620.82128: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.82140: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.82143: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.82146: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.82452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.82634: done with get_vars() 28023 1726853620.82648: done getting variables 28023 1726853620.82703: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:33:40 -0400 (0:00:00.028) 0:00:12.911 ****** 28023 1726853620.82727: entering _queue_task() for managed_node3/set_fact 28023 1726853620.83093: worker is 1 (out of 1 available) 28023 1726853620.83105: exiting _queue_task() for managed_node3/set_fact 28023 1726853620.83117: done queuing things up, now waiting for results queue to drain 28023 1726853620.83119: waiting for pending results... 
28023 1726853620.83249: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28023 1726853620.83401: in run() - task 02083763-bbaf-fdb6-dad7-00000000049e 28023 1726853620.83421: variable 'ansible_search_path' from source: unknown 28023 1726853620.83429: variable 'ansible_search_path' from source: unknown 28023 1726853620.83473: calling self._execute() 28023 1726853620.83560: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.83574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.83587: variable 'omit' from source: magic vars 28023 1726853620.83961: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.84002: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.84105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853620.84386: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853620.84418: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853620.84442: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853620.84467: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853620.84530: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853620.84546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853620.84565: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853620.84589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853620.84646: variable '__network_is_ostree' from source: set_fact 28023 1726853620.84652: Evaluated conditional (not __network_is_ostree is defined): False 28023 1726853620.84655: when evaluation is False, skipping this task 28023 1726853620.84661: _execute() done 28023 1726853620.84664: dumping result to json 28023 1726853620.84666: done dumping result, returning 28023 1726853620.84674: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-fdb6-dad7-00000000049e] 28023 1726853620.84678: sending task result for task 02083763-bbaf-fdb6-dad7-00000000049e 28023 1726853620.84761: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000049e 28023 1726853620.84764: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28023 1726853620.84808: no more pending results, returning what we have 28023 1726853620.84812: results queue empty 28023 1726853620.84813: checking for any_errors_fatal 28023 1726853620.84818: done checking for any_errors_fatal 28023 1726853620.84819: checking for max_fail_percentage 28023 1726853620.84820: done checking for max_fail_percentage 28023 1726853620.84821: checking to see if all hosts have failed and the running result is not ok 28023 1726853620.84822: done checking to see if all hosts have failed 28023 1726853620.84823: getting the remaining hosts for this loop 28023 1726853620.84824: done getting the remaining hosts for this loop 
28023 1726853620.84827: getting the next task for host managed_node3 28023 1726853620.84836: done getting next task for host managed_node3 28023 1726853620.84839: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28023 1726853620.84843: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853620.84859: getting variables 28023 1726853620.84860: in VariableManager get_vars() 28023 1726853620.84899: Calling all_inventory to load vars for managed_node3 28023 1726853620.84901: Calling groups_inventory to load vars for managed_node3 28023 1726853620.84903: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853620.84911: Calling all_plugins_play to load vars for managed_node3 28023 1726853620.84914: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853620.84916: Calling groups_plugins_play to load vars for managed_node3 28023 1726853620.85089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853620.85218: done with get_vars() 28023 1726853620.85226: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:33:40 -0400 (0:00:00.025) 0:00:12.936 ****** 28023 1726853620.85293: entering _queue_task() for managed_node3/service_facts 28023 1726853620.85294: Creating lock for service_facts 28023 1726853620.85490: worker is 1 (out of 1 available) 28023 1726853620.85502: exiting _queue_task() for managed_node3/service_facts 28023 1726853620.85515: done queuing things up, now waiting for results queue to drain 28023 1726853620.85516: waiting for pending results... 
28023 1726853620.85888: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 28023 1726853620.85896: in run() - task 02083763-bbaf-fdb6-dad7-0000000004a0 28023 1726853620.85925: variable 'ansible_search_path' from source: unknown 28023 1726853620.85929: variable 'ansible_search_path' from source: unknown 28023 1726853620.85933: calling self._execute() 28023 1726853620.86004: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.86143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.86160: variable 'omit' from source: magic vars 28023 1726853620.86543: variable 'ansible_distribution_major_version' from source: facts 28023 1726853620.86564: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853620.86577: variable 'omit' from source: magic vars 28023 1726853620.86659: variable 'omit' from source: magic vars 28023 1726853620.86699: variable 'omit' from source: magic vars 28023 1726853620.86751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853620.86794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853620.86930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853620.86934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853620.86936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853620.86938: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853620.86940: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.86942: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 28023 1726853620.87019: Set connection var ansible_shell_type to sh 28023 1726853620.87032: Set connection var ansible_shell_executable to /bin/sh 28023 1726853620.87043: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853620.87063: Set connection var ansible_connection to ssh 28023 1726853620.87078: Set connection var ansible_pipelining to False 28023 1726853620.87090: Set connection var ansible_timeout to 10 28023 1726853620.87123: variable 'ansible_shell_executable' from source: unknown 28023 1726853620.87131: variable 'ansible_connection' from source: unknown 28023 1726853620.87137: variable 'ansible_module_compression' from source: unknown 28023 1726853620.87143: variable 'ansible_shell_type' from source: unknown 28023 1726853620.87149: variable 'ansible_shell_executable' from source: unknown 28023 1726853620.87167: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853620.87182: variable 'ansible_pipelining' from source: unknown 28023 1726853620.87217: variable 'ansible_timeout' from source: unknown 28023 1726853620.87220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853620.87474: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853620.87483: variable 'omit' from source: magic vars 28023 1726853620.87493: starting attempt loop 28023 1726853620.87499: running the handler 28023 1726853620.87519: _low_level_execute_command(): starting 28023 1726853620.87522: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853620.88016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28023 1726853620.88021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853620.88024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.88066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853620.88083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853620.88155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853620.89859: stdout chunk (state=3): >>>/root <<< 28023 1726853620.90010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853620.90013: stdout chunk (state=3): >>><<< 28023 1726853620.90015: stderr chunk (state=3): >>><<< 28023 1726853620.90035: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853620.90060: _low_level_execute_command(): starting 28023 1726853620.90152: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069 `" && echo ansible-tmp-1726853620.9004343-28703-8269113054069="` echo /root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069 `" ) && sleep 0' 28023 1726853620.90690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853620.90704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853620.90717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853620.90734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853620.90749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853620.90763: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853620.90790: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.90808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853620.90821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853620.90890: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853620.90922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853620.90938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853620.90962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853620.91105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853620.93096: stdout chunk (state=3): >>>ansible-tmp-1726853620.9004343-28703-8269113054069=/root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069 <<< 28023 1726853620.93377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853620.93381: stdout chunk (state=3): >>><<< 28023 1726853620.93383: stderr chunk (state=3): >>><<< 28023 1726853620.93386: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853620.9004343-28703-8269113054069=/root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853620.93388: variable 'ansible_module_compression' from source: unknown 28023 1726853620.93390: ANSIBALLZ: Using lock for service_facts 28023 1726853620.93392: ANSIBALLZ: Acquiring lock 28023 1726853620.93393: ANSIBALLZ: Lock acquired: 139729392623664 28023 1726853620.93395: ANSIBALLZ: Creating module 28023 1726853621.06854: ANSIBALLZ: Writing module into payload 28023 1726853621.06963: ANSIBALLZ: Writing module 28023 1726853621.06998: ANSIBALLZ: Renaming module 28023 1726853621.07010: ANSIBALLZ: Done creating module 28023 1726853621.07032: variable 'ansible_facts' from source: unknown 28023 1726853621.07123: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069/AnsiballZ_service_facts.py 28023 1726853621.07290: Sending initial data 28023 1726853621.07293: Sent initial data (160 bytes) 28023 1726853621.07976: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853621.07993: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853621.08040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853621.08058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853621.08082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853621.08184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853621.09857: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853621.09960: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853621.10039: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpt8ymepns /root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069/AnsiballZ_service_facts.py <<< 28023 1726853621.10042: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069/AnsiballZ_service_facts.py" <<< 28023 1726853621.10105: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpt8ymepns" to remote "/root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069/AnsiballZ_service_facts.py" <<< 28023 1726853621.11049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853621.11173: stderr chunk (state=3): >>><<< 28023 1726853621.11177: stdout chunk (state=3): >>><<< 28023 1726853621.11179: done transferring module to remote 28023 1726853621.11181: _low_level_execute_command(): starting 28023 1726853621.11183: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069/ /root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069/AnsiballZ_service_facts.py && sleep 0' 28023 1726853621.11826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853621.11853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853621.11952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853621.11988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853621.12002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853621.12019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853621.12101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853621.14033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853621.14061: stdout chunk (state=3): >>><<< 28023 1726853621.14065: stderr chunk (state=3): >>><<< 28023 1726853621.14081: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853621.14089: _low_level_execute_command(): starting 28023 1726853621.14168: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069/AnsiballZ_service_facts.py && sleep 0' 28023 1726853621.14617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853621.14623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853621.14653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853621.14656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853621.14659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853621.14661: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853621.14711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853621.14715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853621.14724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853621.14798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853622.77819: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": 
"enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28023 1726853622.79497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853622.79501: stdout chunk (state=3): >>><<< 28023 1726853622.79503: stderr chunk (state=3): >>><<< 28023 1726853622.79636: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": 
"systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": 
"dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853622.81565: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853622.81569: _low_level_execute_command(): starting 28023 1726853622.81577: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853620.9004343-28703-8269113054069/ > /dev/null 2>&1 && sleep 0' 28023 1726853622.82182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853622.82185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853622.82188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853622.82190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853622.82193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853622.82195: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853622.82197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853622.82199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853622.82201: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
<<< 28023 1726853622.82203: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853622.82205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853622.82207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853622.82209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853622.82212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853622.82214: stderr chunk (state=3): >>>debug2: match found <<< 28023 1726853622.82230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853622.82297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853622.82309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853622.82318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853622.82407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853622.84446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853622.84457: stdout chunk (state=3): >>><<< 28023 1726853622.84486: stderr chunk (state=3): >>><<< 28023 1726853622.84504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853622.84577: handler run complete 28023 1726853622.84731: variable 'ansible_facts' from source: unknown 28023 1726853622.84949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853622.85249: variable 'ansible_facts' from source: unknown 28023 1726853622.85331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853622.85445: attempt loop complete, returning result 28023 1726853622.85449: _execute() done 28023 1726853622.85451: dumping result to json 28023 1726853622.85490: done dumping result, returning 28023 1726853622.85505: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-fdb6-dad7-0000000004a0] 28023 1726853622.85508: sending task result for task 02083763-bbaf-fdb6-dad7-0000000004a0 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853622.86065: no more pending results, returning what we have 28023 1726853622.86068: results queue empty 28023 1726853622.86069: checking for any_errors_fatal 28023 1726853622.86074: done checking for any_errors_fatal 
28023 1726853622.86075: checking for max_fail_percentage 28023 1726853622.86077: done checking for max_fail_percentage 28023 1726853622.86077: checking to see if all hosts have failed and the running result is not ok 28023 1726853622.86078: done checking to see if all hosts have failed 28023 1726853622.86079: getting the remaining hosts for this loop 28023 1726853622.86080: done getting the remaining hosts for this loop 28023 1726853622.86089: getting the next task for host managed_node3 28023 1726853622.86095: done getting next task for host managed_node3 28023 1726853622.86098: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28023 1726853622.86101: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853622.86110: getting variables 28023 1726853622.86111: in VariableManager get_vars() 28023 1726853622.86142: Calling all_inventory to load vars for managed_node3 28023 1726853622.86144: Calling groups_inventory to load vars for managed_node3 28023 1726853622.86145: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853622.86153: Calling all_plugins_play to load vars for managed_node3 28023 1726853622.86155: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853622.86160: Calling groups_plugins_play to load vars for managed_node3 28023 1726853622.86166: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000004a0 28023 1726853622.86556: WORKER PROCESS EXITING 28023 1726853622.86567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853622.86850: done with get_vars() 28023 1726853622.86862: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:33:42 -0400 (0:00:02.016) 0:00:14.953 ****** 28023 1726853622.86929: entering _queue_task() for managed_node3/package_facts 28023 1726853622.86930: Creating lock for package_facts 28023 1726853622.87284: worker is 1 (out of 1 available) 28023 1726853622.87297: exiting _queue_task() for managed_node3/package_facts 28023 1726853622.87320: done queuing things up, now waiting for results queue to drain 28023 1726853622.87322: waiting for pending results... 
28023 1726853622.87658: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 28023 1726853622.87790: in run() - task 02083763-bbaf-fdb6-dad7-0000000004a1 28023 1726853622.87812: variable 'ansible_search_path' from source: unknown 28023 1726853622.87820: variable 'ansible_search_path' from source: unknown 28023 1726853622.87898: calling self._execute() 28023 1726853622.87994: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853622.87998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853622.88001: variable 'omit' from source: magic vars 28023 1726853622.88297: variable 'ansible_distribution_major_version' from source: facts 28023 1726853622.88304: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853622.88307: variable 'omit' from source: magic vars 28023 1726853622.88368: variable 'omit' from source: magic vars 28023 1726853622.88576: variable 'omit' from source: magic vars 28023 1726853622.88580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853622.88583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853622.88585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853622.88587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853622.88590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853622.88592: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853622.88594: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853622.88597: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 28023 1726853622.88692: Set connection var ansible_shell_type to sh 28023 1726853622.88706: Set connection var ansible_shell_executable to /bin/sh 28023 1726853622.88719: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853622.88729: Set connection var ansible_connection to ssh 28023 1726853622.88740: Set connection var ansible_pipelining to False 28023 1726853622.88751: Set connection var ansible_timeout to 10 28023 1726853622.88783: variable 'ansible_shell_executable' from source: unknown 28023 1726853622.88792: variable 'ansible_connection' from source: unknown 28023 1726853622.88800: variable 'ansible_module_compression' from source: unknown 28023 1726853622.88807: variable 'ansible_shell_type' from source: unknown 28023 1726853622.88814: variable 'ansible_shell_executable' from source: unknown 28023 1726853622.88822: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853622.88833: variable 'ansible_pipelining' from source: unknown 28023 1726853622.88840: variable 'ansible_timeout' from source: unknown 28023 1726853622.88848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853622.89048: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853622.89068: variable 'omit' from source: magic vars 28023 1726853622.89083: starting attempt loop 28023 1726853622.89091: running the handler 28023 1726853622.89112: _low_level_execute_command(): starting 28023 1726853622.89126: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853622.89812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853622.89831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 28023 1726853622.89848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853622.89870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853622.89892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853622.89907: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853622.89924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853622.89945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853622.89960: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853622.89981: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853622.89994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853622.90078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853622.90105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853622.90203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853622.91912: stdout chunk (state=3): >>>/root <<< 28023 1726853622.92017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853622.92038: stderr chunk (state=3): >>><<< 28023 1726853622.92041: stdout chunk (state=3): >>><<< 28023 1726853622.92067: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853622.92080: _low_level_execute_command(): starting 28023 1726853622.92085: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175 `" && echo ansible-tmp-1726853622.920657-28768-190891390366175="` echo /root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175 `" ) && sleep 0' 28023 1726853622.92525: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853622.92536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853622.92539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853622.92542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853622.92582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853622.92585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853622.92658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853622.94608: stdout chunk (state=3): >>>ansible-tmp-1726853622.920657-28768-190891390366175=/root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175 <<< 28023 1726853622.94716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853622.94743: stderr chunk (state=3): >>><<< 28023 1726853622.94749: stdout chunk (state=3): >>><<< 28023 1726853622.94765: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853622.920657-28768-190891390366175=/root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853622.94806: variable 'ansible_module_compression' from source: unknown 28023 1726853622.94845: ANSIBALLZ: Using lock for package_facts 28023 1726853622.94848: ANSIBALLZ: Acquiring lock 28023 1726853622.94851: ANSIBALLZ: Lock acquired: 139729396792016 28023 1726853622.94853: ANSIBALLZ: Creating module 28023 1726853623.22979: ANSIBALLZ: Writing module into payload 28023 1726853623.22984: ANSIBALLZ: Writing module 28023 1726853623.23021: ANSIBALLZ: Renaming module 28023 1726853623.23034: ANSIBALLZ: Done creating module 28023 1726853623.23078: variable 'ansible_facts' from source: unknown 28023 1726853623.23283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175/AnsiballZ_package_facts.py 28023 1726853623.23510: Sending initial data 28023 1726853623.23519: Sent initial data (161 bytes) 28023 1726853623.24405: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853623.24463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853623.24498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853623.24607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853623.24798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853623.26484: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28023 1726853623.26500: stderr chunk (state=3): >>>debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853623.26536: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853623.26598: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpcp7qu10k /root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175/AnsiballZ_package_facts.py <<< 28023 1726853623.26611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175/AnsiballZ_package_facts.py" <<< 28023 1726853623.26676: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpcp7qu10k" to remote "/root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175/AnsiballZ_package_facts.py" <<< 28023 1726853623.29582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853623.29586: stderr chunk (state=3): >>><<< 28023 1726853623.29589: stdout chunk (state=3): >>><<< 28023 1726853623.29591: done transferring module to remote 28023 1726853623.29593: _low_level_execute_command(): starting 28023 1726853623.29595: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175/ /root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175/AnsiballZ_package_facts.py && sleep 0' 28023 1726853623.30830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853623.31052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853623.31153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853623.33066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853623.33082: stdout chunk (state=3): >>><<< 28023 1726853623.33113: stderr chunk (state=3): >>><<< 28023 1726853623.33373: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853623.33379: _low_level_execute_command(): starting 28023 1726853623.33381: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175/AnsiballZ_package_facts.py && sleep 0' 28023 1726853623.34467: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853623.34483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853623.34560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853623.34576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853623.34620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853623.34642: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 28023 1726853623.34896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853623.34958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853623.79709: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 28023 1726853623.79780: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", 
"version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": 
"4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": 
"libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", 
"version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", 
"version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": 
"0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": 
"2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": 
"1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": 
"rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": 
[{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 28023 1726853623.79915: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": 
"chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": 
"libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": 
"perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", 
"version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": 
"git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", 
"version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": 
"python3-urllib3", "version": "1.<<< 28023 1726853623.79933: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28023 1726853623.81892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853623.81896: stdout chunk (state=3): >>><<< 28023 1726853623.81898: stderr chunk (state=3): >>><<< 28023 1726853623.82289: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", 
"release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": 
"libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": 
"libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": 
[{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": 
"kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", 
"source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": 
"rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853623.85265: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853623.85299: _low_level_execute_command(): starting 28023 1726853623.85310: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853622.920657-28768-190891390366175/ > /dev/null 2>&1 && sleep 0' 28023 1726853623.85918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853623.85933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853623.85947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28023 1726853623.85970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853623.85991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853623.86004: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853623.86019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853623.86037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853623.86050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853623.86139: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853623.86150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853623.86245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853623.88195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853623.88199: stdout chunk (state=3): >>><<< 28023 1726853623.88203: stderr chunk (state=3): >>><<< 28023 1726853623.88224: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853623.88230: handler run complete 28023 1726853623.89068: variable 'ansible_facts' from source: unknown 28023 1726853623.89520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853623.91507: variable 'ansible_facts' from source: unknown 28023 1726853623.91935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853623.92978: attempt loop complete, returning result 28023 1726853623.92982: _execute() done 28023 1726853623.92984: dumping result to json 28023 1726853623.93228: done dumping result, returning 28023 1726853623.93232: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-fdb6-dad7-0000000004a1] 28023 1726853623.93234: sending task result for task 02083763-bbaf-fdb6-dad7-0000000004a1 28023 1726853623.95852: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000004a1 28023 1726853623.95855: WORKER 
PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853623.95955: no more pending results, returning what we have 28023 1726853623.95960: results queue empty 28023 1726853623.95961: checking for any_errors_fatal 28023 1726853623.95966: done checking for any_errors_fatal 28023 1726853623.95967: checking for max_fail_percentage 28023 1726853623.95968: done checking for max_fail_percentage 28023 1726853623.95969: checking to see if all hosts have failed and the running result is not ok 28023 1726853623.95992: done checking to see if all hosts have failed 28023 1726853623.95994: getting the remaining hosts for this loop 28023 1726853623.95995: done getting the remaining hosts for this loop 28023 1726853623.95999: getting the next task for host managed_node3 28023 1726853623.96006: done getting next task for host managed_node3 28023 1726853623.96009: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28023 1726853623.96013: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853623.96023: getting variables 28023 1726853623.96025: in VariableManager get_vars() 28023 1726853623.96061: Calling all_inventory to load vars for managed_node3 28023 1726853623.96064: Calling groups_inventory to load vars for managed_node3 28023 1726853623.96067: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853623.96079: Calling all_plugins_play to load vars for managed_node3 28023 1726853623.96082: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853623.96085: Calling groups_plugins_play to load vars for managed_node3 28023 1726853623.98989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853624.01418: done with get_vars() 28023 1726853624.01450: done getting variables 28023 1726853624.01514: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:33:44 -0400 (0:00:01.146) 0:00:16.099 ****** 28023 1726853624.01555: entering _queue_task() for managed_node3/debug 28023 1726853624.02083: worker is 1 (out of 1 available) 28023 1726853624.02094: exiting _queue_task() for managed_node3/debug 28023 1726853624.02105: done queuing things up, now waiting for results queue to drain 28023 1726853624.02106: waiting for pending results... 
28023 1726853624.02288: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 28023 1726853624.02347: in run() - task 02083763-bbaf-fdb6-dad7-00000000001c 28023 1726853624.02373: variable 'ansible_search_path' from source: unknown 28023 1726853624.02440: variable 'ansible_search_path' from source: unknown 28023 1726853624.02444: calling self._execute() 28023 1726853624.02519: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.02530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.02547: variable 'omit' from source: magic vars 28023 1726853624.02941: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.02962: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853624.02975: variable 'omit' from source: magic vars 28023 1726853624.03036: variable 'omit' from source: magic vars 28023 1726853624.03143: variable 'network_provider' from source: set_fact 28023 1726853624.03200: variable 'omit' from source: magic vars 28023 1726853624.03220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853624.03262: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853624.03291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853624.03317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853624.03418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853624.03421: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853624.03424: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 
1726853624.03427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.03501: Set connection var ansible_shell_type to sh 28023 1726853624.03515: Set connection var ansible_shell_executable to /bin/sh 28023 1726853624.03530: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853624.03540: Set connection var ansible_connection to ssh 28023 1726853624.03549: Set connection var ansible_pipelining to False 28023 1726853624.03562: Set connection var ansible_timeout to 10 28023 1726853624.03595: variable 'ansible_shell_executable' from source: unknown 28023 1726853624.03604: variable 'ansible_connection' from source: unknown 28023 1726853624.03611: variable 'ansible_module_compression' from source: unknown 28023 1726853624.03618: variable 'ansible_shell_type' from source: unknown 28023 1726853624.03625: variable 'ansible_shell_executable' from source: unknown 28023 1726853624.03636: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.03644: variable 'ansible_pipelining' from source: unknown 28023 1726853624.03651: variable 'ansible_timeout' from source: unknown 28023 1726853624.03661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.03810: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853624.03851: variable 'omit' from source: magic vars 28023 1726853624.03854: starting attempt loop 28023 1726853624.03859: running the handler 28023 1726853624.03896: handler run complete 28023 1726853624.03914: attempt loop complete, returning result 28023 1726853624.03962: _execute() done 28023 1726853624.03965: dumping result to json 28023 1726853624.03968: done dumping result, returning 
28023 1726853624.03970: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-fdb6-dad7-00000000001c] 28023 1726853624.03974: sending task result for task 02083763-bbaf-fdb6-dad7-00000000001c ok: [managed_node3] => {} MSG: Using network provider: nm 28023 1726853624.04126: no more pending results, returning what we have 28023 1726853624.04130: results queue empty 28023 1726853624.04130: checking for any_errors_fatal 28023 1726853624.04139: done checking for any_errors_fatal 28023 1726853624.04139: checking for max_fail_percentage 28023 1726853624.04142: done checking for max_fail_percentage 28023 1726853624.04143: checking to see if all hosts have failed and the running result is not ok 28023 1726853624.04144: done checking to see if all hosts have failed 28023 1726853624.04144: getting the remaining hosts for this loop 28023 1726853624.04146: done getting the remaining hosts for this loop 28023 1726853624.04150: getting the next task for host managed_node3 28023 1726853624.04160: done getting next task for host managed_node3 28023 1726853624.04164: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28023 1726853624.04169: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853624.04182: getting variables 28023 1726853624.04184: in VariableManager get_vars() 28023 1726853624.04227: Calling all_inventory to load vars for managed_node3 28023 1726853624.04230: Calling groups_inventory to load vars for managed_node3 28023 1726853624.04232: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853624.04245: Calling all_plugins_play to load vars for managed_node3 28023 1726853624.04248: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853624.04251: Calling groups_plugins_play to load vars for managed_node3 28023 1726853624.04986: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000001c 28023 1726853624.04989: WORKER PROCESS EXITING 28023 1726853624.05935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853624.07583: done with get_vars() 28023 1726853624.07615: done getting variables 28023 1726853624.07680: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:33:44 -0400 (0:00:00.061) 0:00:16.161 ****** 28023 1726853624.07715: entering _queue_task() for managed_node3/fail 28023 1726853624.08053: worker is 1 (out of 1 available) 28023 1726853624.08068: exiting _queue_task() for managed_node3/fail 28023 1726853624.08085: done queuing things up, now waiting for results queue to drain 28023 1726853624.08086: waiting for pending results... 
28023 1726853624.08391: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28023 1726853624.08530: in run() - task 02083763-bbaf-fdb6-dad7-00000000001d 28023 1726853624.08549: variable 'ansible_search_path' from source: unknown 28023 1726853624.08560: variable 'ansible_search_path' from source: unknown 28023 1726853624.08603: calling self._execute() 28023 1726853624.08703: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.08714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.08731: variable 'omit' from source: magic vars 28023 1726853624.09101: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.09118: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853624.09233: variable 'network_state' from source: role '' defaults 28023 1726853624.09247: Evaluated conditional (network_state != {}): False 28023 1726853624.09254: when evaluation is False, skipping this task 28023 1726853624.09263: _execute() done 28023 1726853624.09276: dumping result to json 28023 1726853624.09283: done dumping result, returning 28023 1726853624.09292: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-fdb6-dad7-00000000001d] 28023 1726853624.09301: sending task result for task 02083763-bbaf-fdb6-dad7-00000000001d 28023 1726853624.09505: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000001d 28023 1726853624.09509: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853624.09555: no more pending results, 
returning what we have 28023 1726853624.09562: results queue empty 28023 1726853624.09562: checking for any_errors_fatal 28023 1726853624.09569: done checking for any_errors_fatal 28023 1726853624.09569: checking for max_fail_percentage 28023 1726853624.09573: done checking for max_fail_percentage 28023 1726853624.09574: checking to see if all hosts have failed and the running result is not ok 28023 1726853624.09575: done checking to see if all hosts have failed 28023 1726853624.09576: getting the remaining hosts for this loop 28023 1726853624.09578: done getting the remaining hosts for this loop 28023 1726853624.09581: getting the next task for host managed_node3 28023 1726853624.09588: done getting next task for host managed_node3 28023 1726853624.09592: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28023 1726853624.09596: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853624.09613: getting variables 28023 1726853624.09615: in VariableManager get_vars() 28023 1726853624.09662: Calling all_inventory to load vars for managed_node3 28023 1726853624.09665: Calling groups_inventory to load vars for managed_node3 28023 1726853624.09668: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853624.09878: Calling all_plugins_play to load vars for managed_node3 28023 1726853624.09882: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853624.09885: Calling groups_plugins_play to load vars for managed_node3 28023 1726853624.11351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853624.13101: done with get_vars() 28023 1726853624.13125: done getting variables 28023 1726853624.13189: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:33:44 -0400 (0:00:00.055) 0:00:16.216 ****** 28023 1726853624.13222: entering _queue_task() for managed_node3/fail 28023 1726853624.13559: worker is 1 (out of 1 available) 28023 1726853624.13774: exiting _queue_task() for managed_node3/fail 28023 1726853624.13786: done queuing things up, now waiting for results queue to drain 28023 1726853624.13787: waiting for pending results... 
28023 1726853624.13869: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28023 1726853624.14121: in run() - task 02083763-bbaf-fdb6-dad7-00000000001e 28023 1726853624.14125: variable 'ansible_search_path' from source: unknown 28023 1726853624.14128: variable 'ansible_search_path' from source: unknown 28023 1726853624.14130: calling self._execute() 28023 1726853624.14183: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.14195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.14207: variable 'omit' from source: magic vars 28023 1726853624.14581: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.14597: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853624.14719: variable 'network_state' from source: role '' defaults 28023 1726853624.14737: Evaluated conditional (network_state != {}): False 28023 1726853624.14745: when evaluation is False, skipping this task 28023 1726853624.14753: _execute() done 28023 1726853624.14763: dumping result to json 28023 1726853624.14778: done dumping result, returning 28023 1726853624.14791: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-fdb6-dad7-00000000001e] 28023 1726853624.14803: sending task result for task 02083763-bbaf-fdb6-dad7-00000000001e skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853624.15068: no more pending results, returning what we have 28023 1726853624.15073: results queue empty 28023 1726853624.15074: checking for any_errors_fatal 28023 1726853624.15084: done checking for any_errors_fatal 
28023 1726853624.15085: checking for max_fail_percentage 28023 1726853624.15087: done checking for max_fail_percentage 28023 1726853624.15088: checking to see if all hosts have failed and the running result is not ok 28023 1726853624.15089: done checking to see if all hosts have failed 28023 1726853624.15090: getting the remaining hosts for this loop 28023 1726853624.15091: done getting the remaining hosts for this loop 28023 1726853624.15095: getting the next task for host managed_node3 28023 1726853624.15102: done getting next task for host managed_node3 28023 1726853624.15108: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28023 1726853624.15112: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853624.15130: getting variables 28023 1726853624.15132: in VariableManager get_vars() 28023 1726853624.15382: Calling all_inventory to load vars for managed_node3 28023 1726853624.15385: Calling groups_inventory to load vars for managed_node3 28023 1726853624.15388: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853624.15398: Calling all_plugins_play to load vars for managed_node3 28023 1726853624.15401: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853624.15404: Calling groups_plugins_play to load vars for managed_node3 28023 1726853624.16084: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000001e 28023 1726853624.16088: WORKER PROCESS EXITING 28023 1726853624.16752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853624.18314: done with get_vars() 28023 1726853624.18343: done getting variables 28023 1726853624.18406: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:33:44 -0400 (0:00:00.052) 0:00:16.268 ****** 28023 1726853624.18440: entering _queue_task() for managed_node3/fail 28023 1726853624.18783: worker is 1 (out of 1 available) 28023 1726853624.18794: exiting _queue_task() for managed_node3/fail 28023 1726853624.18806: done queuing things up, now waiting for results queue to drain 28023 1726853624.18808: waiting for pending results... 
28023 1726853624.19095: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28023 1726853624.19254: in run() - task 02083763-bbaf-fdb6-dad7-00000000001f 28023 1726853624.19279: variable 'ansible_search_path' from source: unknown 28023 1726853624.19287: variable 'ansible_search_path' from source: unknown 28023 1726853624.19336: calling self._execute() 28023 1726853624.19428: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.19442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.19459: variable 'omit' from source: magic vars 28023 1726853624.19844: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.19868: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853624.20055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853624.21781: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853624.21833: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853624.21864: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853624.21891: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853624.21912: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853624.21976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.21996: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.22014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.22039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.22052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.22126: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.22139: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28023 1726853624.22224: variable 'ansible_distribution' from source: facts 28023 1726853624.22227: variable '__network_rh_distros' from source: role '' defaults 28023 1726853624.22236: Evaluated conditional (ansible_distribution in __network_rh_distros): True 28023 1726853624.22410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.22429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.22447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 
1726853624.22477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.22511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.22741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.22745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.22747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.22750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.22752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.22755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.22761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 28023 1726853624.22763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.22765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.22783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.23113: variable 'network_connections' from source: task vars 28023 1726853624.23129: variable 'interface0' from source: play vars 28023 1726853624.23204: variable 'interface0' from source: play vars 28023 1726853624.23222: variable 'interface0' from source: play vars 28023 1726853624.23286: variable 'interface0' from source: play vars 28023 1726853624.23303: variable 'interface1' from source: play vars 28023 1726853624.23431: variable 'interface1' from source: play vars 28023 1726853624.23435: variable 'interface1' from source: play vars 28023 1726853624.23509: variable 'interface1' from source: play vars 28023 1726853624.23513: variable 'network_state' from source: role '' defaults 28023 1726853624.23600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853624.23730: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853624.23766: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853624.23791: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853624.23814: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853624.23864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853624.23882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853624.23899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.23917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853624.23946: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 28023 1726853624.23949: when evaluation is False, skipping this task 28023 1726853624.23952: _execute() done 28023 1726853624.23954: dumping result to json 28023 1726853624.23958: done dumping result, returning 28023 1726853624.23973: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-fdb6-dad7-00000000001f] 28023 1726853624.23976: sending task result for task 02083763-bbaf-fdb6-dad7-00000000001f 28023 1726853624.24060: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000001f 28023 1726853624.24062: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 28023 1726853624.24113: no more pending results, returning what we have 28023 1726853624.24116: results queue empty 28023 1726853624.24117: checking for any_errors_fatal 28023 1726853624.24123: done checking for any_errors_fatal 28023 1726853624.24124: checking for max_fail_percentage 28023 1726853624.24125: done checking for max_fail_percentage 28023 1726853624.24126: checking to see if all hosts have failed and the running result is not ok 28023 1726853624.24127: done checking to see if all hosts have failed 28023 1726853624.24128: getting the remaining hosts for this loop 28023 1726853624.24129: done getting the remaining hosts for this loop 28023 1726853624.24133: getting the next task for host managed_node3 28023 1726853624.24138: done getting next task for host managed_node3 28023 1726853624.24142: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28023 1726853624.24145: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853624.24158: getting variables 28023 1726853624.24159: in VariableManager get_vars() 28023 1726853624.24205: Calling all_inventory to load vars for managed_node3 28023 1726853624.24207: Calling groups_inventory to load vars for managed_node3 28023 1726853624.24210: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853624.24221: Calling all_plugins_play to load vars for managed_node3 28023 1726853624.24223: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853624.24226: Calling groups_plugins_play to load vars for managed_node3 28023 1726853624.25131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853624.29144: done with get_vars() 28023 1726853624.29173: done getting variables 28023 1726853624.29251: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:33:44 -0400 (0:00:00.108) 0:00:16.376 ****** 28023 1726853624.29283: entering _queue_task() for managed_node3/dnf 28023 1726853624.29601: worker is 1 (out of 1 available) 28023 1726853624.29612: exiting _queue_task() for managed_node3/dnf 28023 1726853624.29624: done queuing things up, now waiting for results queue to drain 28023 1726853624.29625: waiting for pending results... 
28023 1726853624.30091: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28023 1726853624.30097: in run() - task 02083763-bbaf-fdb6-dad7-000000000020 28023 1726853624.30100: variable 'ansible_search_path' from source: unknown 28023 1726853624.30102: variable 'ansible_search_path' from source: unknown 28023 1726853624.30121: calling self._execute() 28023 1726853624.30219: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.30235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.30248: variable 'omit' from source: magic vars 28023 1726853624.30654: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.30675: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853624.30820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853624.32366: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853624.32476: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853624.32502: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853624.32577: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853624.32676: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853624.32682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.32706: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.32739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.32789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.32808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.32936: variable 'ansible_distribution' from source: facts 28023 1726853624.32947: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.32972: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28023 1726853624.33114: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853624.33288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.33304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.33352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.33402: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.33432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.33463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.33483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.33500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.33524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.33535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.33568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.33587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 
1726853624.33603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.33637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.33650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.33755: variable 'network_connections' from source: task vars 28023 1726853624.33763: variable 'interface0' from source: play vars 28023 1726853624.33813: variable 'interface0' from source: play vars 28023 1726853624.33821: variable 'interface0' from source: play vars 28023 1726853624.33866: variable 'interface0' from source: play vars 28023 1726853624.33880: variable 'interface1' from source: play vars 28023 1726853624.33918: variable 'interface1' from source: play vars 28023 1726853624.33924: variable 'interface1' from source: play vars 28023 1726853624.33965: variable 'interface1' from source: play vars 28023 1726853624.34018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853624.34145: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853624.34174: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853624.34198: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853624.34220: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853624.34254: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853624.34272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853624.34290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.34310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853624.34354: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853624.34502: variable 'network_connections' from source: task vars 28023 1726853624.34505: variable 'interface0' from source: play vars 28023 1726853624.34550: variable 'interface0' from source: play vars 28023 1726853624.34558: variable 'interface0' from source: play vars 28023 1726853624.34599: variable 'interface0' from source: play vars 28023 1726853624.34608: variable 'interface1' from source: play vars 28023 1726853624.34652: variable 'interface1' from source: play vars 28023 1726853624.34660: variable 'interface1' from source: play vars 28023 1726853624.34701: variable 'interface1' from source: play vars 28023 1726853624.34728: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28023 1726853624.34731: when evaluation is False, skipping this task 28023 1726853624.34734: _execute() done 28023 1726853624.34736: dumping result to json 28023 1726853624.34738: done dumping result, returning 28023 1726853624.34747: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-000000000020] 28023 1726853624.34750: sending task result for task 02083763-bbaf-fdb6-dad7-000000000020 28023 1726853624.34837: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000020 28023 1726853624.34839: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28023 1726853624.34900: no more pending results, returning what we have 28023 1726853624.34903: results queue empty 28023 1726853624.34904: checking for any_errors_fatal 28023 1726853624.34914: done checking for any_errors_fatal 28023 1726853624.34914: checking for max_fail_percentage 28023 1726853624.34916: done checking for max_fail_percentage 28023 1726853624.34917: checking to see if all hosts have failed and the running result is not ok 28023 1726853624.34918: done checking to see if all hosts have failed 28023 1726853624.34919: getting the remaining hosts for this loop 28023 1726853624.34920: done getting the remaining hosts for this loop 28023 1726853624.34924: getting the next task for host managed_node3 28023 1726853624.34930: done getting next task for host managed_node3 28023 1726853624.34933: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28023 1726853624.34936: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853624.34949: getting variables 28023 1726853624.34951: in VariableManager get_vars() 28023 1726853624.34996: Calling all_inventory to load vars for managed_node3 28023 1726853624.34998: Calling groups_inventory to load vars for managed_node3 28023 1726853624.35001: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853624.35011: Calling all_plugins_play to load vars for managed_node3 28023 1726853624.35014: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853624.35016: Calling groups_plugins_play to load vars for managed_node3 28023 1726853624.36370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853624.37277: done with get_vars() 28023 1726853624.37298: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28023 1726853624.37351: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:33:44 -0400 (0:00:00.080) 0:00:16.457 ****** 28023 
1726853624.37379: entering _queue_task() for managed_node3/yum 28023 1726853624.37381: Creating lock for yum 28023 1726853624.37646: worker is 1 (out of 1 available) 28023 1726853624.37659: exiting _queue_task() for managed_node3/yum 28023 1726853624.37679: done queuing things up, now waiting for results queue to drain 28023 1726853624.37681: waiting for pending results... 28023 1726853624.37863: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28023 1726853624.37963: in run() - task 02083763-bbaf-fdb6-dad7-000000000021 28023 1726853624.37978: variable 'ansible_search_path' from source: unknown 28023 1726853624.37982: variable 'ansible_search_path' from source: unknown 28023 1726853624.38013: calling self._execute() 28023 1726853624.38087: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.38094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.38103: variable 'omit' from source: magic vars 28023 1726853624.38576: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.38579: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853624.38640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853624.40824: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853624.41239: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853624.41286: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853624.41324: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 
1726853624.41354: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853624.41448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.41486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.41517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.41562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.41583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.41682: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.41703: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28023 1726853624.41710: when evaluation is False, skipping this task 28023 1726853624.41717: _execute() done 28023 1726853624.41725: dumping result to json 28023 1726853624.41733: done dumping result, returning 28023 1726853624.41744: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-000000000021] 28023 1726853624.41752: sending task result for task 
02083763-bbaf-fdb6-dad7-000000000021 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28023 1726853624.41914: no more pending results, returning what we have 28023 1726853624.41917: results queue empty 28023 1726853624.41918: checking for any_errors_fatal 28023 1726853624.41926: done checking for any_errors_fatal 28023 1726853624.41926: checking for max_fail_percentage 28023 1726853624.41928: done checking for max_fail_percentage 28023 1726853624.41929: checking to see if all hosts have failed and the running result is not ok 28023 1726853624.41930: done checking to see if all hosts have failed 28023 1726853624.41930: getting the remaining hosts for this loop 28023 1726853624.41932: done getting the remaining hosts for this loop 28023 1726853624.41935: getting the next task for host managed_node3 28023 1726853624.41947: done getting next task for host managed_node3 28023 1726853624.41950: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28023 1726853624.41953: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853624.41966: getting variables 28023 1726853624.41967: in VariableManager get_vars() 28023 1726853624.42010: Calling all_inventory to load vars for managed_node3 28023 1726853624.42013: Calling groups_inventory to load vars for managed_node3 28023 1726853624.42015: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853624.42026: Calling all_plugins_play to load vars for managed_node3 28023 1726853624.42029: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853624.42031: Calling groups_plugins_play to load vars for managed_node3 28023 1726853624.42585: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000021 28023 1726853624.42589: WORKER PROCESS EXITING 28023 1726853624.43678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853624.45194: done with get_vars() 28023 1726853624.45218: done getting variables 28023 1726853624.45281: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:33:44 -0400 (0:00:00.079) 0:00:16.537 ****** 28023 1726853624.45315: entering _queue_task() for managed_node3/fail 28023 1726853624.45647: worker is 1 (out of 1 available) 28023 1726853624.45661: exiting _queue_task() for managed_node3/fail 28023 1726853624.45675: done queuing things up, now waiting for results queue to drain 28023 1726853624.45676: waiting for pending results... 
28023 1726853624.46090: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28023 1726853624.46129: in run() - task 02083763-bbaf-fdb6-dad7-000000000022 28023 1726853624.46149: variable 'ansible_search_path' from source: unknown 28023 1726853624.46159: variable 'ansible_search_path' from source: unknown 28023 1726853624.46205: calling self._execute() 28023 1726853624.46312: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.46329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.46343: variable 'omit' from source: magic vars 28023 1726853624.46739: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.46866: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853624.46895: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853624.47102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853624.49305: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853624.49399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853624.49463: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853624.49508: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853624.49542: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853624.49633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28023 1726853624.49672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.49710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.49756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.49794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.49900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.49904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.49907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.49943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.49967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.50012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.50045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.50081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.50130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.50238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.50353: variable 'network_connections' from source: task vars 28023 1726853624.50377: variable 'interface0' from source: play vars 28023 1726853624.50465: variable 'interface0' from source: play vars 28023 1726853624.50482: variable 'interface0' from source: play vars 28023 1726853624.50546: variable 'interface0' from source: play vars 28023 1726853624.50573: variable 'interface1' from source: play vars 28023 1726853624.50632: variable 'interface1' from source: play vars 28023 1726853624.50645: variable 'interface1' from source: play vars 28023 1726853624.50713: variable 'interface1' from source: play vars 28023 1726853624.50800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
28023 1726853624.50994: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853624.51034: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853624.51083: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853624.51121: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853624.51210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853624.51213: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853624.51237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.51286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853624.51578: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853624.51868: variable 'network_connections' from source: task vars 28023 1726853624.52011: variable 'interface0' from source: play vars 28023 1726853624.52052: variable 'interface0' from source: play vars 28023 1726853624.52378: variable 'interface0' from source: play vars 28023 1726853624.52381: variable 'interface0' from source: play vars 28023 1726853624.52383: variable 'interface1' from source: play vars 28023 1726853624.52410: variable 'interface1' from source: play vars 28023 1726853624.52430: variable 
'interface1' from source: play vars 28023 1726853624.52525: variable 'interface1' from source: play vars 28023 1726853624.52622: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28023 1726853624.52666: when evaluation is False, skipping this task 28023 1726853624.52683: _execute() done 28023 1726853624.52691: dumping result to json 28023 1726853624.52701: done dumping result, returning 28023 1726853624.52713: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-000000000022] 28023 1726853624.52725: sending task result for task 02083763-bbaf-fdb6-dad7-000000000022 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28023 1726853624.53065: no more pending results, returning what we have 28023 1726853624.53069: results queue empty 28023 1726853624.53070: checking for any_errors_fatal 28023 1726853624.53079: done checking for any_errors_fatal 28023 1726853624.53080: checking for max_fail_percentage 28023 1726853624.53082: done checking for max_fail_percentage 28023 1726853624.53083: checking to see if all hosts have failed and the running result is not ok 28023 1726853624.53084: done checking to see if all hosts have failed 28023 1726853624.53085: getting the remaining hosts for this loop 28023 1726853624.53087: done getting the remaining hosts for this loop 28023 1726853624.53090: getting the next task for host managed_node3 28023 1726853624.53097: done getting next task for host managed_node3 28023 1726853624.53101: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28023 1726853624.53105: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853624.53120: getting variables 28023 1726853624.53121: in VariableManager get_vars() 28023 1726853624.53170: Calling all_inventory to load vars for managed_node3 28023 1726853624.53179: Calling groups_inventory to load vars for managed_node3 28023 1726853624.53183: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853624.53203: Calling all_plugins_play to load vars for managed_node3 28023 1726853624.53206: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853624.53209: Calling groups_plugins_play to load vars for managed_node3 28023 1726853624.53884: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000022 28023 1726853624.53888: WORKER PROCESS EXITING 28023 1726853624.54874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853624.56464: done with get_vars() 28023 1726853624.56494: done getting variables 28023 1726853624.56548: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:33:44 -0400 
(0:00:00.112) 0:00:16.649 ****** 28023 1726853624.56587: entering _queue_task() for managed_node3/package 28023 1726853624.57329: worker is 1 (out of 1 available) 28023 1726853624.57343: exiting _queue_task() for managed_node3/package 28023 1726853624.57359: done queuing things up, now waiting for results queue to drain 28023 1726853624.57361: waiting for pending results... 28023 1726853624.57895: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 28023 1726853624.58282: in run() - task 02083763-bbaf-fdb6-dad7-000000000023 28023 1726853624.58357: variable 'ansible_search_path' from source: unknown 28023 1726853624.58385: variable 'ansible_search_path' from source: unknown 28023 1726853624.58472: calling self._execute() 28023 1726853624.58784: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.58797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.58820: variable 'omit' from source: magic vars 28023 1726853624.59634: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.59655: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853624.59861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853624.60138: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853624.60195: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853624.60230: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853624.60476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853624.60479: variable 'network_packages' from source: role '' defaults 28023 1726853624.60543: variable '__network_provider_setup' from 
source: role '' defaults 28023 1726853624.60562: variable '__network_service_name_default_nm' from source: role '' defaults 28023 1726853624.60638: variable '__network_service_name_default_nm' from source: role '' defaults 28023 1726853624.60652: variable '__network_packages_default_nm' from source: role '' defaults 28023 1726853624.60721: variable '__network_packages_default_nm' from source: role '' defaults 28023 1726853624.60912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853624.63738: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853624.63943: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853624.64069: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853624.64075: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853624.64077: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853624.64239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.64476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.64479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.64481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.64483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.64649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.64685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.64790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.64842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.64866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.65439: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28023 1726853624.65704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.65812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 28023 1726853624.65843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.65935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.66076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.66149: variable 'ansible_python' from source: facts 28023 1726853624.66262: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28023 1726853624.66473: variable '__network_wpa_supplicant_required' from source: role '' defaults 28023 1726853624.66766: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28023 1726853624.66904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.66933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.66967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.67180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 
1726853624.67183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.67308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853624.67332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853624.67363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.67460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853624.67483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853624.67852: variable 'network_connections' from source: task vars 28023 1726853624.67870: variable 'interface0' from source: play vars 28023 1726853624.68097: variable 'interface0' from source: play vars 28023 1726853624.68111: variable 'interface0' from source: play vars 28023 1726853624.68478: variable 'interface0' from source: play vars 28023 1726853624.68481: variable 'interface1' from source: play vars 28023 1726853624.68631: variable 'interface1' from source: play vars 28023 1726853624.68646: variable 'interface1' from source: play vars 28023 1726853624.68822: variable 'interface1' from source: play vars 28023 1726853624.68961: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853624.69067: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853624.69104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853624.69179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853624.69306: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853624.69886: variable 'network_connections' from source: task vars 28023 1726853624.70011: variable 'interface0' from source: play vars 28023 1726853624.70130: variable 'interface0' from source: play vars 28023 1726853624.70338: variable 'interface0' from source: play vars 28023 1726853624.70388: variable 'interface0' from source: play vars 28023 1726853624.70467: variable 'interface1' from source: play vars 28023 1726853624.70694: variable 'interface1' from source: play vars 28023 1726853624.70709: variable 'interface1' from source: play vars 28023 1726853624.70925: variable 'interface1' from source: play vars 28023 1726853624.71034: variable '__network_packages_default_wireless' from source: role '' defaults 28023 1726853624.71378: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853624.72001: variable 'network_connections' from source: task vars 28023 1726853624.72011: variable 'interface0' from source: play vars 28023 1726853624.72180: variable 'interface0' from source: play vars 28023 
1726853624.72192: variable 'interface0' from source: play vars 28023 1726853624.72265: variable 'interface0' from source: play vars 28023 1726853624.72334: variable 'interface1' from source: play vars 28023 1726853624.72491: variable 'interface1' from source: play vars 28023 1726853624.72502: variable 'interface1' from source: play vars 28023 1726853624.72609: variable 'interface1' from source: play vars 28023 1726853624.72678: variable '__network_packages_default_team' from source: role '' defaults 28023 1726853624.72832: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853624.73294: variable 'network_connections' from source: task vars 28023 1726853624.73305: variable 'interface0' from source: play vars 28023 1726853624.73375: variable 'interface0' from source: play vars 28023 1726853624.73387: variable 'interface0' from source: play vars 28023 1726853624.73459: variable 'interface0' from source: play vars 28023 1726853624.73492: variable 'interface1' from source: play vars 28023 1726853624.73576: variable 'interface1' from source: play vars 28023 1726853624.73589: variable 'interface1' from source: play vars 28023 1726853624.73694: variable 'interface1' from source: play vars 28023 1726853624.73789: variable '__network_service_name_default_initscripts' from source: role '' defaults 28023 1726853624.73892: variable '__network_service_name_default_initscripts' from source: role '' defaults 28023 1726853624.73928: variable '__network_packages_default_initscripts' from source: role '' defaults 28023 1726853624.73998: variable '__network_packages_default_initscripts' from source: role '' defaults 28023 1726853624.74254: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28023 1726853624.74764: variable 'network_connections' from source: task vars 28023 1726853624.74778: variable 'interface0' from source: play vars 28023 1726853624.74855: variable 'interface0' from source: play vars 28023 
1726853624.74882: variable 'interface0' from source: play vars 28023 1726853624.75025: variable 'interface0' from source: play vars 28023 1726853624.75028: variable 'interface1' from source: play vars 28023 1726853624.75031: variable 'interface1' from source: play vars 28023 1726853624.75040: variable 'interface1' from source: play vars 28023 1726853624.75103: variable 'interface1' from source: play vars 28023 1726853624.75120: variable 'ansible_distribution' from source: facts 28023 1726853624.75132: variable '__network_rh_distros' from source: role '' defaults 28023 1726853624.75143: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.75177: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28023 1726853624.75496: variable 'ansible_distribution' from source: facts 28023 1726853624.75499: variable '__network_rh_distros' from source: role '' defaults 28023 1726853624.75510: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.75527: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28023 1726853624.75690: variable 'ansible_distribution' from source: facts 28023 1726853624.75694: variable '__network_rh_distros' from source: role '' defaults 28023 1726853624.75776: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.75779: variable 'network_provider' from source: set_fact 28023 1726853624.75781: variable 'ansible_facts' from source: unknown 28023 1726853624.76309: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28023 1726853624.76319: when evaluation is False, skipping this task 28023 1726853624.76322: _execute() done 28023 1726853624.76325: dumping result to json 28023 1726853624.76327: done dumping result, returning 28023 1726853624.76335: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 
[02083763-bbaf-fdb6-dad7-000000000023] 28023 1726853624.76339: sending task result for task 02083763-bbaf-fdb6-dad7-000000000023 28023 1726853624.76431: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000023 28023 1726853624.76434: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28023 1726853624.76500: no more pending results, returning what we have 28023 1726853624.76504: results queue empty 28023 1726853624.76509: checking for any_errors_fatal 28023 1726853624.76517: done checking for any_errors_fatal 28023 1726853624.76518: checking for max_fail_percentage 28023 1726853624.76519: done checking for max_fail_percentage 28023 1726853624.76520: checking to see if all hosts have failed and the running result is not ok 28023 1726853624.76521: done checking to see if all hosts have failed 28023 1726853624.76521: getting the remaining hosts for this loop 28023 1726853624.76523: done getting the remaining hosts for this loop 28023 1726853624.76526: getting the next task for host managed_node3 28023 1726853624.76533: done getting next task for host managed_node3 28023 1726853624.76537: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28023 1726853624.76540: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853624.76555: getting variables 28023 1726853624.76557: in VariableManager get_vars() 28023 1726853624.76599: Calling all_inventory to load vars for managed_node3 28023 1726853624.76601: Calling groups_inventory to load vars for managed_node3 28023 1726853624.76603: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853624.76613: Calling all_plugins_play to load vars for managed_node3 28023 1726853624.76616: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853624.76618: Calling groups_plugins_play to load vars for managed_node3 28023 1726853624.77872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853624.79507: done with get_vars() 28023 1726853624.79541: done getting variables 28023 1726853624.79608: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:33:44 -0400 (0:00:00.230) 0:00:16.880 ****** 28023 1726853624.79646: entering _queue_task() for managed_node3/package 28023 1726853624.80211: worker is 1 (out of 1 available) 28023 1726853624.80222: exiting _queue_task() for managed_node3/package 28023 1726853624.80232: done queuing things up, now waiting for results queue to drain 28023 1726853624.80234: waiting for pending results... 
28023 1726853624.80364: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28023 1726853624.80468: in run() - task 02083763-bbaf-fdb6-dad7-000000000024 28023 1726853624.80491: variable 'ansible_search_path' from source: unknown 28023 1726853624.80570: variable 'ansible_search_path' from source: unknown 28023 1726853624.80575: calling self._execute() 28023 1726853624.80657: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.80676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.80695: variable 'omit' from source: magic vars 28023 1726853624.81198: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.81235: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853624.81375: variable 'network_state' from source: role '' defaults 28023 1726853624.81394: Evaluated conditional (network_state != {}): False 28023 1726853624.81402: when evaluation is False, skipping this task 28023 1726853624.81409: _execute() done 28023 1726853624.81438: dumping result to json 28023 1726853624.81441: done dumping result, returning 28023 1726853624.81443: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-fdb6-dad7-000000000024] 28023 1726853624.81445: sending task result for task 02083763-bbaf-fdb6-dad7-000000000024 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853624.81702: no more pending results, returning what we have 28023 1726853624.81707: results queue empty 28023 1726853624.81708: checking for any_errors_fatal 28023 1726853624.81716: done checking for any_errors_fatal 28023 1726853624.81717: checking for max_fail_percentage 28023 
1726853624.81719: done checking for max_fail_percentage 28023 1726853624.81720: checking to see if all hosts have failed and the running result is not ok 28023 1726853624.81721: done checking to see if all hosts have failed 28023 1726853624.81722: getting the remaining hosts for this loop 28023 1726853624.81723: done getting the remaining hosts for this loop 28023 1726853624.81727: getting the next task for host managed_node3 28023 1726853624.81734: done getting next task for host managed_node3 28023 1726853624.81738: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28023 1726853624.81742: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853624.81764: getting variables 28023 1726853624.81766: in VariableManager get_vars() 28023 1726853624.81816: Calling all_inventory to load vars for managed_node3 28023 1726853624.81819: Calling groups_inventory to load vars for managed_node3 28023 1726853624.81822: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853624.81990: Calling all_plugins_play to load vars for managed_node3 28023 1726853624.81994: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853624.81997: Calling groups_plugins_play to load vars for managed_node3 28023 1726853624.82601: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000024 28023 1726853624.82609: WORKER PROCESS EXITING 28023 1726853624.83419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853624.85310: done with get_vars() 28023 1726853624.85333: done getting variables 28023 1726853624.85401: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:33:44 -0400 (0:00:00.057) 0:00:16.938 ****** 28023 1726853624.85435: entering _queue_task() for managed_node3/package 28023 1726853624.86021: worker is 1 (out of 1 available) 28023 1726853624.86035: exiting _queue_task() for managed_node3/package 28023 1726853624.86048: done queuing things up, now waiting for results queue to drain 28023 1726853624.86049: waiting for pending results... 
28023 1726853624.86322: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28023 1726853624.86459: in run() - task 02083763-bbaf-fdb6-dad7-000000000025 28023 1726853624.86482: variable 'ansible_search_path' from source: unknown 28023 1726853624.86490: variable 'ansible_search_path' from source: unknown 28023 1726853624.86539: calling self._execute() 28023 1726853624.86652: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853624.86665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853624.86682: variable 'omit' from source: magic vars 28023 1726853624.87085: variable 'ansible_distribution_major_version' from source: facts 28023 1726853624.87122: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853624.87249: variable 'network_state' from source: role '' defaults 28023 1726853624.87285: Evaluated conditional (network_state != {}): False 28023 1726853624.87294: when evaluation is False, skipping this task 28023 1726853624.87302: _execute() done 28023 1726853624.87379: dumping result to json 28023 1726853624.87383: done dumping result, returning 28023 1726853624.87386: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-fdb6-dad7-000000000025] 28023 1726853624.87388: sending task result for task 02083763-bbaf-fdb6-dad7-000000000025 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853624.87526: no more pending results, returning what we have 28023 1726853624.87530: results queue empty 28023 1726853624.87531: checking for any_errors_fatal 28023 1726853624.87544: done checking for any_errors_fatal 28023 1726853624.87545: checking for max_fail_percentage 28023 
1726853624.87547: done checking for max_fail_percentage
28023 1726853624.87548: checking to see if all hosts have failed and the running result is not ok
28023 1726853624.87549: done checking to see if all hosts have failed
28023 1726853624.87549: getting the remaining hosts for this loop
28023 1726853624.87551: done getting the remaining hosts for this loop
28023 1726853624.87555: getting the next task for host managed_node3
28023 1726853624.87562: done getting next task for host managed_node3
28023 1726853624.87566: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
28023 1726853624.87570: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28023 1726853624.87590: getting variables
28023 1726853624.87591: in VariableManager get_vars()
28023 1726853624.87636: Calling all_inventory to load vars for managed_node3
28023 1726853624.87640: Calling groups_inventory to load vars for managed_node3
28023 1726853624.87642: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853624.87657: Calling all_plugins_play to load vars for managed_node3
28023 1726853624.87660: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853624.87664: Calling groups_plugins_play to load vars for managed_node3
28023 1726853624.88404: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000025
28023 1726853624.88408: WORKER PROCESS EXITING
28023 1726853624.89498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853624.91201: done with get_vars()
28023 1726853624.91236: done getting variables
28023 1726853624.91342: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 13:33:44 -0400 (0:00:00.059) 0:00:16.997 ******
28023 1726853624.91378: entering _queue_task() for managed_node3/service
28023 1726853624.91380: Creating lock for service
28023 1726853624.91739: worker is 1 (out of 1 available)
28023 1726853624.91751: exiting _queue_task() for managed_node3/service
28023 1726853624.91765: done queuing things up, now waiting for results queue to drain
28023 1726853624.91767: waiting for pending results...
28023 1726853624.92049: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
28023 1726853624.92213: in run() - task 02083763-bbaf-fdb6-dad7-000000000026
28023 1726853624.92236: variable 'ansible_search_path' from source: unknown
28023 1726853624.92246: variable 'ansible_search_path' from source: unknown
28023 1726853624.92290: calling self._execute()
28023 1726853624.92397: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853624.92416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853624.92433: variable 'omit' from source: magic vars
28023 1726853624.92821: variable 'ansible_distribution_major_version' from source: facts
28023 1726853624.92851: Evaluated conditional (ansible_distribution_major_version != '6'): True
28023 1726853624.93070: variable '__network_wireless_connections_defined' from source: role '' defaults
28023 1726853624.93182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28023 1726853624.95477: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28023 1726853624.95554: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28023 1726853624.95605: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28023 1726853624.95642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28023 1726853624.95698: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28023 1726853624.95778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28023 1726853624.95808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28023 1726853624.95837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853624.95882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28023 1726853624.95896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28023 1726853624.95939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28023 1726853624.95963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28023 1726853624.95993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853624.96029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28023 1726853624.96043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28023 1726853624.96087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28023 1726853624.96113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28023 1726853624.96138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853624.96178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28023 1726853624.96192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28023 1726853624.96386: variable 'network_connections' from source: task vars
28023 1726853624.96397: variable 'interface0' from source: play vars
28023 1726853624.96498: variable 'interface0' from source: play vars
28023 1726853624.96515: variable 'interface0' from source: play vars
28023 1726853624.96583: variable 'interface0' from source: play vars
28023 1726853624.96592: variable 'interface1' from source: play vars
28023 1726853624.96638: variable 'interface1' from source: play vars
28023 1726853624.96641: variable 'interface1' from source: play vars
28023 1726853624.96688: variable 'interface1' from source: play vars
28023 1726853624.96737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
28023 1726853624.97127: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
28023 1726853624.97154: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
28023 1726853624.97180: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
28023 1726853624.97204: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
28023 1726853624.97233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
28023 1726853624.97248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
28023 1726853624.97266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853624.97286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
28023 1726853624.97336: variable '__network_team_connections_defined' from source: role '' defaults
28023 1726853624.97488: variable 'network_connections' from source: task vars
28023 1726853624.97492: variable 'interface0' from source: play vars
28023 1726853624.97536: variable 'interface0' from source: play vars
28023 1726853624.97542: variable 'interface0' from source: play vars
28023 1726853624.97585: variable 'interface0' from source: play vars
28023 1726853624.97594: variable 'interface1' from source: play vars
28023 1726853624.97638: variable 'interface1' from source: play vars
28023 1726853624.97650: variable 'interface1' from source: play vars
28023 1726853624.97688: variable 'interface1' from source: play vars
28023 1726853624.97714: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
28023 1726853624.97717: when evaluation is False, skipping this task
28023 1726853624.97720: _execute() done
28023 1726853624.97724: dumping result to json
28023 1726853624.97726: done dumping result, returning
28023 1726853624.97734: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-000000000026]
28023 1726853624.97740: sending task result for task 02083763-bbaf-fdb6-dad7-000000000026
28023 1726853624.97826: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000026
28023 1726853624.97829: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
28023 1726853624.97893: no more pending results, returning what we have
28023 1726853624.97897: results queue empty
28023 1726853624.97897: checking for any_errors_fatal
28023 1726853624.97904: done checking for any_errors_fatal
28023 1726853624.97904: checking for max_fail_percentage
28023 1726853624.97906: done checking for max_fail_percentage
28023 1726853624.97907: checking to see if all hosts have failed and the running result is not ok
28023 1726853624.97908: done checking to see if all hosts have failed
28023 1726853624.97908: getting the remaining hosts for this loop
28023 1726853624.97910: done getting the remaining hosts for this loop
28023 1726853624.97913: getting the next task for host managed_node3
28023 1726853624.97920: done getting next task for host managed_node3
28023 1726853624.97923: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
28023 1726853624.97926: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28023 1726853624.97940: getting variables
28023 1726853624.97941: in VariableManager get_vars()
28023 1726853624.97988: Calling all_inventory to load vars for managed_node3
28023 1726853624.97990: Calling groups_inventory to load vars for managed_node3
28023 1726853624.97993: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853624.98003: Calling all_plugins_play to load vars for managed_node3
28023 1726853624.98005: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853624.98008: Calling groups_plugins_play to load vars for managed_node3
28023 1726853624.99225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853625.00600: done with get_vars()
28023 1726853625.00628: done getting variables
28023 1726853625.00710: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 13:33:45 -0400 (0:00:00.093) 0:00:17.091 ******
28023 1726853625.00759: entering _queue_task() for managed_node3/service
28023 1726853625.01130: worker is 1 (out of 1 available)
28023 1726853625.01143: exiting _queue_task() for managed_node3/service
28023 1726853625.01159: done queuing things up, now waiting for results queue to drain
28023 1726853625.01161: waiting for pending results...
28023 1726853625.01522: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
28023 1726853625.01727: in run() - task 02083763-bbaf-fdb6-dad7-000000000027
28023 1726853625.01731: variable 'ansible_search_path' from source: unknown
28023 1726853625.01734: variable 'ansible_search_path' from source: unknown
28023 1726853625.01766: calling self._execute()
28023 1726853625.01940: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853625.01944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853625.01947: variable 'omit' from source: magic vars
28023 1726853625.02337: variable 'ansible_distribution_major_version' from source: facts
28023 1726853625.02360: Evaluated conditional (ansible_distribution_major_version != '6'): True
28023 1726853625.02807: variable 'network_provider' from source: set_fact
28023 1726853625.02821: variable 'network_state' from source: role '' defaults
28023 1726853625.02839: Evaluated conditional (network_provider == "nm" or network_state != {}): True
28023 1726853625.02851: variable 'omit' from source: magic vars
28023 1726853625.02931: variable 'omit' from source: magic vars
28023 1726853625.02970: variable 'network_service_name' from source: role '' defaults
28023 1726853625.03060: variable 'network_service_name' from source: role '' defaults
28023 1726853625.03207: variable '__network_provider_setup' from source: role '' defaults
28023 1726853625.03220: variable '__network_service_name_default_nm' from source: role '' defaults
28023 1726853625.03304: variable '__network_service_name_default_nm' from source: role '' defaults
28023 1726853625.03321: variable '__network_packages_default_nm' from source: role '' defaults
28023 1726853625.03401: variable '__network_packages_default_nm' from source: role '' defaults
28023 1726853625.03646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28023 1726853625.06027: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28023 1726853625.06084: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28023 1726853625.06115: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28023 1726853625.06144: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28023 1726853625.06166: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28023 1726853625.06227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28023 1726853625.06252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28023 1726853625.06274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853625.06300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28023 1726853625.06311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28023 1726853625.06346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28023 1726853625.06363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28023 1726853625.06383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853625.06408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28023 1726853625.06418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28023 1726853625.06575: variable '__network_packages_default_gobject_packages' from source: role '' defaults
28023 1726853625.06653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28023 1726853625.06674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28023 1726853625.06692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853625.06717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28023 1726853625.06727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28023 1726853625.06795: variable 'ansible_python' from source: facts
28023 1726853625.06812: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
28023 1726853625.06872: variable '__network_wpa_supplicant_required' from source: role '' defaults
28023 1726853625.06928: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
28023 1726853625.07013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28023 1726853625.07030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28023 1726853625.07047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853625.07075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28023 1726853625.07086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28023 1726853625.07120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28023 1726853625.07140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28023 1726853625.07156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853625.07184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28023 1726853625.07195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28023 1726853625.07289: variable 'network_connections' from source: task vars
28023 1726853625.07296: variable 'interface0' from source: play vars
28023 1726853625.07350: variable 'interface0' from source: play vars
28023 1726853625.07358: variable 'interface0' from source: play vars
28023 1726853625.07411: variable 'interface0' from source: play vars
28023 1726853625.07432: variable 'interface1' from source: play vars
28023 1726853625.07486: variable 'interface1' from source: play vars
28023 1726853625.07494: variable 'interface1' from source: play vars
28023 1726853625.07544: variable 'interface1' from source: play vars
28023 1726853625.07625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
28023 1726853625.07759: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
28023 1726853625.07799: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
28023 1726853625.07829: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
28023 1726853625.07858: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
28023 1726853625.07906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
28023 1726853625.07927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
28023 1726853625.07948: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
28023 1726853625.07975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
28023 1726853625.08013: variable '__network_wireless_connections_defined' from source: role '' defaults
28023 1726853625.08188: variable 'network_connections' from source: task vars
28023 1726853625.08195: variable 'interface0' from source: play vars
28023 1726853625.08247: variable 'interface0' from source: play vars
28023 1726853625.08256: variable 'interface0' from source: play vars
28023 1726853625.08314: variable 'interface0' from source: play vars
28023 1726853625.08331: variable 'interface1' from source: play vars
28023 1726853625.08384: variable 'interface1' from source: play vars
28023 1726853625.08392: variable 'interface1' from source: play vars
28023 1726853625.08444: variable 'interface1' from source: play vars
28023 1726853625.08488: variable '__network_packages_default_wireless' from source: role '' defaults
28023 1726853625.08543: variable '__network_wireless_connections_defined' from source: role '' defaults
28023 1726853625.08737: variable 'network_connections' from source: task vars
28023 1726853625.08740: variable 'interface0' from source: play vars
28023 1726853625.08796: variable 'interface0' from source: play vars
28023 1726853625.08801: variable 'interface0' from source: play vars
28023 1726853625.08848: variable 'interface0' from source: play vars
28023 1726853625.08859: variable 'interface1' from source: play vars
28023 1726853625.08910: variable 'interface1' from source: play vars
28023 1726853625.08915: variable 'interface1' from source: play vars
28023 1726853625.08966: variable 'interface1' from source: play vars
28023 1726853625.08990: variable '__network_packages_default_team' from source: role '' defaults
28023 1726853625.09042: variable '__network_team_connections_defined' from source: role '' defaults
28023 1726853625.09230: variable 'network_connections' from source: task vars
28023 1726853625.09233: variable 'interface0' from source: play vars
28023 1726853625.09285: variable 'interface0' from source: play vars
28023 1726853625.09296: variable 'interface0' from source: play vars
28023 1726853625.09340: variable 'interface0' from source: play vars
28023 1726853625.09349: variable 'interface1' from source: play vars
28023 1726853625.09401: variable 'interface1' from source: play vars
28023 1726853625.09404: variable 'interface1' from source: play vars
28023 1726853625.09455: variable 'interface1' from source: play vars
28023 1726853625.09502: variable '__network_service_name_default_initscripts' from source: role '' defaults
28023 1726853625.09545: variable '__network_service_name_default_initscripts' from source: role '' defaults
28023 1726853625.09551: variable '__network_packages_default_initscripts' from source: role '' defaults
28023 1726853625.09595: variable '__network_packages_default_initscripts' from source: role '' defaults
28023 1726853625.09740: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
28023 1726853625.10046: variable 'network_connections' from source: task vars
28023 1726853625.10049: variable 'interface0' from source: play vars
28023 1726853625.10097: variable 'interface0' from source: play vars
28023 1726853625.10102: variable 'interface0' from source: play vars
28023 1726853625.10142: variable 'interface0' from source: play vars
28023 1726853625.10151: variable 'interface1' from source: play vars
28023 1726853625.10198: variable 'interface1' from source: play vars
28023 1726853625.10203: variable 'interface1' from source: play vars
28023 1726853625.10244: variable 'interface1' from source: play vars
28023 1726853625.10253: variable 'ansible_distribution' from source: facts
28023 1726853625.10256: variable '__network_rh_distros' from source: role '' defaults
28023 1726853625.10264: variable 'ansible_distribution_major_version' from source: facts
28023 1726853625.10284: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
28023 1726853625.10398: variable 'ansible_distribution' from source: facts
28023 1726853625.10401: variable '__network_rh_distros' from source: role '' defaults
28023 1726853625.10404: variable 'ansible_distribution_major_version' from source: facts
28023 1726853625.10416: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
28023 1726853625.10528: variable 'ansible_distribution' from source: facts
28023 1726853625.10532: variable '__network_rh_distros' from source: role '' defaults
28023 1726853625.10535: variable 'ansible_distribution_major_version' from source: facts
28023 1726853625.10565: variable 'network_provider' from source: set_fact
28023 1726853625.10583: variable 'omit' from source: magic vars
28023 1726853625.10608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28023 1726853625.10628: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28023 1726853625.10645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28023 1726853625.10657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28023 1726853625.10670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28023 1726853625.10693: variable 'inventory_hostname' from source: host vars for 'managed_node3'
28023 1726853625.10696: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853625.10698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853625.10767: Set connection var ansible_shell_type to sh
28023 1726853625.10775: Set connection var ansible_shell_executable to /bin/sh
28023 1726853625.10781: Set connection var ansible_module_compression to ZIP_DEFLATED
28023 1726853625.10786: Set connection var ansible_connection to ssh
28023 1726853625.10790: Set connection var ansible_pipelining to False
28023 1726853625.10795: Set connection var ansible_timeout to 10
28023 1726853625.10821: variable 'ansible_shell_executable' from source: unknown
28023 1726853625.10824: variable 'ansible_connection' from source: unknown
28023 1726853625.10827: variable 'ansible_module_compression' from source: unknown
28023 1726853625.10829: variable 'ansible_shell_type' from source: unknown
28023 1726853625.10831: variable 'ansible_shell_executable' from source: unknown
28023 1726853625.10833: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853625.10835: variable 'ansible_pipelining' from source: unknown
28023 1726853625.10839: variable 'ansible_timeout' from source: unknown
28023 1726853625.10841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853625.10913: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
28023 1726853625.10922: variable 'omit' from source: magic vars
28023 1726853625.10927: starting attempt loop
28023 1726853625.10930: running the handler
28023 1726853625.10987: variable 'ansible_facts' from source: unknown
28023 1726853625.11450: _low_level_execute_command(): starting
28023 1726853625.11454: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
28023 1726853625.11946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
28023 1726853625.11979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28023 1726853625.11982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28023 1726853625.11985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
28023 1726853625.11987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28023 1726853625.12040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
28023 1726853625.12043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28023 1726853625.12045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28023 1726853625.12120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28023 1726853625.13819: stdout chunk (state=3): >>>/root <<<
28023 1726853625.13921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28023 1726853625.13950: stderr chunk (state=3): >>><<<
28023 1726853625.13953: stdout chunk (state=3): >>><<<
28023 1726853625.13976: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28023 1726853625.13989: _low_level_execute_command(): starting
28023 1726853625.13994: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988 `" && echo ansible-tmp-1726853625.1397564-28853-62805320735988="` echo /root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988 `" ) && sleep 0'
28023 1726853625.14445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
28023 1726853625.14449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<<
28023 1726853625.14451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<<
28023 1726853625.14453: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28023 1726853625.14455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28023 1726853625.14508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
28023 1726853625.14515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28023 1726853625.14518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28023 1726853625.14577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28023 1726853625.16531: stdout chunk (state=3): >>>ansible-tmp-1726853625.1397564-28853-62805320735988=/root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988 <<<
28023 1726853625.16662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28023 1726853625.16665: stdout chunk (state=3): >>><<<
28023 1726853625.16673: stderr chunk (state=3): >>><<<
28023 1726853625.16691: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853625.1397564-28853-62805320735988=/root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28023 1726853625.16718: variable 'ansible_module_compression' from source:
unknown 28023 1726853625.16760: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 28023 1726853625.16764: ANSIBALLZ: Acquiring lock 28023 1726853625.16767: ANSIBALLZ: Lock acquired: 139729396667488 28023 1726853625.16774: ANSIBALLZ: Creating module 28023 1726853625.42626: ANSIBALLZ: Writing module into payload 28023 1726853625.42692: ANSIBALLZ: Writing module 28023 1726853625.42712: ANSIBALLZ: Renaming module 28023 1726853625.42719: ANSIBALLZ: Done creating module 28023 1726853625.42759: variable 'ansible_facts' from source: unknown 28023 1726853625.42975: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988/AnsiballZ_systemd.py 28023 1726853625.43184: Sending initial data 28023 1726853625.43188: Sent initial data (155 bytes) 28023 1726853625.44054: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853625.44139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853625.44184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853625.44201: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853625.44223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853625.44315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853625.46219: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853625.46276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853625.46342: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpuaid5hhq /root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988/AnsiballZ_systemd.py <<< 28023 1726853625.46346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988/AnsiballZ_systemd.py" <<< 28023 1726853625.46494: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpuaid5hhq" to remote "/root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988/AnsiballZ_systemd.py" <<< 28023 1726853625.48774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853625.48851: stderr chunk (state=3): >>><<< 28023 1726853625.48866: stdout chunk (state=3): >>><<< 28023 1726853625.48936: done transferring module to remote 28023 1726853625.48951: _low_level_execute_command(): starting 28023 1726853625.48963: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988/ /root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988/AnsiballZ_systemd.py && sleep 0' 28023 1726853625.49535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853625.49552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853625.49560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853625.49587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853625.49590: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853625.49696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853625.49699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853625.49703: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853625.49705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853625.49713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853625.49732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853625.49819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853625.51779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853625.51782: stdout chunk (state=3): >>><<< 28023 1726853625.51785: stderr chunk (state=3): >>><<< 28023 1726853625.51787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853625.51789: _low_level_execute_command(): starting 28023 1726853625.51792: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988/AnsiballZ_systemd.py && sleep 0' 28023 1726853625.52340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853625.52386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853625.52442: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853625.52452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853625.52496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853625.52569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853625.82579: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": 
"{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10694656", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311775744", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1868087000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", 
"MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 28023 1726853625.82605: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysini<<< 28023 1726853625.82616: stdout chunk (state=3): >>>t.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28023 1726853625.84878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853625.84882: stderr chunk (state=3): >>><<< 28023 1726853625.84885: stdout chunk (state=3): >>><<< 28023 1726853625.84889: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10694656", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311775744", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1868087000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
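For reference, the `module_args` recorded in the invocation above (`name: NetworkManager`, `state: started`, `enabled: true`, `scope: system`) would be produced by a task roughly like the following. This is a hypothetical reconstruction from the logged arguments and the TASK banner, not the role's actual source:

```yaml
# Hypothetical reconstruction of the task whose invocation is logged above.
# All parameter values are taken from the recorded module_args; the task
# name matches the TASK banner ("Enable and start NetworkManager").
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    scope: system
  no_log: true   # matches '_ansible_no_log': True and explains the
                 # "censored" result shown in the task output
```

Because `no_log` is set, the controller replaces the full systemd property dump with the "output has been hidden" placeholder seen in the `ok: [managed_node3]` result.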
28023 1726853625.84899: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853625.84923: _low_level_execute_command(): starting 28023 1726853625.84934: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853625.1397564-28853-62805320735988/ > /dev/null 2>&1 && sleep 0' 28023 1726853625.85559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853625.85595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853625.85607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28023 1726853625.85618: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853625.85683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853625.85695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853625.85764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853625.87701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853625.87705: stdout chunk (state=3): >>><<< 28023 1726853625.87708: stderr chunk (state=3): >>><<< 28023 1726853625.87803: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 28023 1726853625.87807: handler run complete 28023 1726853625.87809: attempt loop complete, returning result 28023 1726853625.87811: _execute() done 28023 1726853625.87813: dumping result to json 28023 1726853625.87825: done dumping result, returning 28023 1726853625.87836: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-fdb6-dad7-000000000027] 28023 1726853625.87839: sending task result for task 02083763-bbaf-fdb6-dad7-000000000027 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853625.89076: no more pending results, returning what we have 28023 1726853625.89079: results queue empty 28023 1726853625.89080: checking for any_errors_fatal 28023 1726853625.89093: done checking for any_errors_fatal 28023 1726853625.89094: checking for max_fail_percentage 28023 1726853625.89096: done checking for max_fail_percentage 28023 1726853625.89097: checking to see if all hosts have failed and the running result is not ok 28023 1726853625.89098: done checking to see if all hosts have failed 28023 1726853625.89098: getting the remaining hosts for this loop 28023 1726853625.89100: done getting the remaining hosts for this loop 28023 1726853625.89103: getting the next task for host managed_node3 28023 1726853625.89108: done getting next task for host managed_node3 28023 1726853625.89112: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28023 1726853625.89114: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853625.89123: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000027 28023 1726853625.89126: WORKER PROCESS EXITING 28023 1726853625.89132: getting variables 28023 1726853625.89133: in VariableManager get_vars() 28023 1726853625.89167: Calling all_inventory to load vars for managed_node3 28023 1726853625.89170: Calling groups_inventory to load vars for managed_node3 28023 1726853625.89175: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853625.89185: Calling all_plugins_play to load vars for managed_node3 28023 1726853625.89188: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853625.89194: Calling groups_plugins_play to load vars for managed_node3 28023 1726853625.89994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853625.90848: done with get_vars() 28023 1726853625.90865: done getting variables 28023 1726853625.90910: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:33:45 -0400 (0:00:00.901) 0:00:17.993 ****** 28023 1726853625.90934: entering _queue_task() for 
managed_node3/service 28023 1726853625.91177: worker is 1 (out of 1 available) 28023 1726853625.91191: exiting _queue_task() for managed_node3/service 28023 1726853625.91203: done queuing things up, now waiting for results queue to drain 28023 1726853625.91204: waiting for pending results... 28023 1726853625.91381: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28023 1726853625.91476: in run() - task 02083763-bbaf-fdb6-dad7-000000000028 28023 1726853625.91489: variable 'ansible_search_path' from source: unknown 28023 1726853625.91492: variable 'ansible_search_path' from source: unknown 28023 1726853625.91521: calling self._execute() 28023 1726853625.91597: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853625.91601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853625.91610: variable 'omit' from source: magic vars 28023 1726853625.91885: variable 'ansible_distribution_major_version' from source: facts 28023 1726853625.91895: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853625.91978: variable 'network_provider' from source: set_fact 28023 1726853625.91982: Evaluated conditional (network_provider == "nm"): True 28023 1726853625.92041: variable '__network_wpa_supplicant_required' from source: role '' defaults 28023 1726853625.92108: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28023 1726853625.92225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853625.93885: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853625.93927: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853625.93956: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853625.93985: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853625.94010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853625.94109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853625.94162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853625.94166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853625.94188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853625.94202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853625.94310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853625.94314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853625.94317: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853625.94577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853625.94581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853625.94584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853625.94586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853625.94588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853625.94590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853625.94593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853625.94627: variable 'network_connections' from source: task vars 28023 1726853625.94640: variable 'interface0' from source: play vars 28023 1726853625.94715: variable 'interface0' from source: play vars 28023 1726853625.94725: 
variable 'interface0' from source: play vars 28023 1726853625.94787: variable 'interface0' from source: play vars 28023 1726853625.94799: variable 'interface1' from source: play vars 28023 1726853625.94882: variable 'interface1' from source: play vars 28023 1726853625.94888: variable 'interface1' from source: play vars 28023 1726853625.94946: variable 'interface1' from source: play vars 28023 1726853625.95022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853625.95144: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853625.95175: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853625.95197: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853625.95217: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853625.95250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853625.95268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853625.95298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853625.95316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853625.95355: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 
1726853625.95506: variable 'network_connections' from source: task vars 28023 1726853625.95510: variable 'interface0' from source: play vars 28023 1726853625.95567: variable 'interface0' from source: play vars 28023 1726853625.95570: variable 'interface0' from source: play vars 28023 1726853625.95616: variable 'interface0' from source: play vars 28023 1726853625.95625: variable 'interface1' from source: play vars 28023 1726853625.95669: variable 'interface1' from source: play vars 28023 1726853625.95675: variable 'interface1' from source: play vars 28023 1726853625.95718: variable 'interface1' from source: play vars 28023 1726853625.95752: Evaluated conditional (__network_wpa_supplicant_required): False 28023 1726853625.95755: when evaluation is False, skipping this task 28023 1726853625.95758: _execute() done 28023 1726853625.95763: dumping result to json 28023 1726853625.95766: done dumping result, returning 28023 1726853625.95775: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-fdb6-dad7-000000000028] 28023 1726853625.95780: sending task result for task 02083763-bbaf-fdb6-dad7-000000000028 28023 1726853625.95863: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000028 28023 1726853625.95865: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28023 1726853625.95910: no more pending results, returning what we have 28023 1726853625.95913: results queue empty 28023 1726853625.95914: checking for any_errors_fatal 28023 1726853625.95932: done checking for any_errors_fatal 28023 1726853625.95932: checking for max_fail_percentage 28023 1726853625.95934: done checking for max_fail_percentage 28023 1726853625.95935: checking to see if all hosts have failed and the running result is not ok 28023 1726853625.95936: done checking to see if all hosts 
have failed 28023 1726853625.95936: getting the remaining hosts for this loop 28023 1726853625.95938: done getting the remaining hosts for this loop 28023 1726853625.95942: getting the next task for host managed_node3 28023 1726853625.95948: done getting next task for host managed_node3 28023 1726853625.95951: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28023 1726853625.95954: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853625.95967: getting variables 28023 1726853625.95968: in VariableManager get_vars() 28023 1726853625.96012: Calling all_inventory to load vars for managed_node3 28023 1726853625.96015: Calling groups_inventory to load vars for managed_node3 28023 1726853625.96017: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853625.96028: Calling all_plugins_play to load vars for managed_node3 28023 1726853625.96030: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853625.96033: Calling groups_plugins_play to load vars for managed_node3 28023 1726853625.97053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853625.98182: done with get_vars() 28023 1726853625.98200: done getting variables 28023 1726853625.98245: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:33:45 -0400 (0:00:00.073) 0:00:18.066 ****** 28023 1726853625.98273: entering _queue_task() for managed_node3/service 28023 1726853625.98524: worker is 1 (out of 1 available) 28023 1726853625.98538: exiting _queue_task() for managed_node3/service 28023 1726853625.98550: done queuing things up, now waiting for results queue to drain 28023 1726853625.98552: waiting for pending results... 28023 1726853625.98729: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 28023 1726853625.98823: in run() - task 02083763-bbaf-fdb6-dad7-000000000029 28023 1726853625.98836: variable 'ansible_search_path' from source: unknown 28023 1726853625.98840: variable 'ansible_search_path' from source: unknown 28023 1726853625.98873: calling self._execute() 28023 1726853625.98948: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853625.98953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853625.98964: variable 'omit' from source: magic vars 28023 1726853625.99237: variable 'ansible_distribution_major_version' from source: facts 28023 1726853625.99247: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853625.99328: variable 'network_provider' from source: set_fact 28023 1726853625.99333: Evaluated conditional (network_provider == "initscripts"): False 28023 1726853625.99336: when evaluation is False, skipping this task 28023 1726853625.99339: _execute() done 28023 1726853625.99344: dumping result to json 28023 1726853625.99346: done dumping result, 
returning 28023 1726853625.99353: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-fdb6-dad7-000000000029] 28023 1726853625.99358: sending task result for task 02083763-bbaf-fdb6-dad7-000000000029 28023 1726853625.99445: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000029 28023 1726853625.99448: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853625.99495: no more pending results, returning what we have 28023 1726853625.99499: results queue empty 28023 1726853625.99500: checking for any_errors_fatal 28023 1726853625.99512: done checking for any_errors_fatal 28023 1726853625.99513: checking for max_fail_percentage 28023 1726853625.99515: done checking for max_fail_percentage 28023 1726853625.99516: checking to see if all hosts have failed and the running result is not ok 28023 1726853625.99517: done checking to see if all hosts have failed 28023 1726853625.99518: getting the remaining hosts for this loop 28023 1726853625.99519: done getting the remaining hosts for this loop 28023 1726853625.99522: getting the next task for host managed_node3 28023 1726853625.99528: done getting next task for host managed_node3 28023 1726853625.99531: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28023 1726853625.99534: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853625.99550: getting variables 28023 1726853625.99551: in VariableManager get_vars() 28023 1726853625.99594: Calling all_inventory to load vars for managed_node3 28023 1726853625.99597: Calling groups_inventory to load vars for managed_node3 28023 1726853625.99599: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853625.99608: Calling all_plugins_play to load vars for managed_node3 28023 1726853625.99610: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853625.99612: Calling groups_plugins_play to load vars for managed_node3 28023 1726853626.00391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853626.01334: done with get_vars() 28023 1726853626.01350: done getting variables 28023 1726853626.01395: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:33:46 -0400 (0:00:00.031) 0:00:18.098 ****** 28023 1726853626.01421: entering _queue_task() for managed_node3/copy 28023 1726853626.01665: worker is 1 (out of 1 available) 28023 1726853626.01679: exiting _queue_task() for managed_node3/copy 28023 1726853626.01693: done queuing things up, now waiting for results queue to drain 28023 1726853626.01695: waiting for pending results... 
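The skip records in this section follow a common pattern: the worker logs `Evaluated conditional (network_provider == "initscripts"): False`, then `when evaluation is False, skipping this task`, and reports the failing expression as `false_condition`. A hypothetical sketch of a task guarded this way (the `copy` parameters are illustrative, not taken from the log):

```yaml
# Hypothetical sketch of the `when:`-guard pattern seen in the skip records
# above. When the expression evaluates to False, the task is skipped and
# the failing expression is surfaced as "false_condition" in the result.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network          # illustrative path, not in the log
    content: "# Managed by the network system role\n"
  when: network_provider == "initscripts"
```

On a host where `network_provider` is `nm` (as set by `set_fact` here), the conditional is False and the task never reaches the remote node.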
28023 1726853626.01879: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28023 1726853626.01968: in run() - task 02083763-bbaf-fdb6-dad7-00000000002a 28023 1726853626.01983: variable 'ansible_search_path' from source: unknown 28023 1726853626.01986: variable 'ansible_search_path' from source: unknown 28023 1726853626.02015: calling self._execute() 28023 1726853626.02091: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853626.02095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853626.02104: variable 'omit' from source: magic vars 28023 1726853626.02380: variable 'ansible_distribution_major_version' from source: facts 28023 1726853626.02390: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853626.02468: variable 'network_provider' from source: set_fact 28023 1726853626.02473: Evaluated conditional (network_provider == "initscripts"): False 28023 1726853626.02476: when evaluation is False, skipping this task 28023 1726853626.02482: _execute() done 28023 1726853626.02484: dumping result to json 28023 1726853626.02487: done dumping result, returning 28023 1726853626.02497: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-fdb6-dad7-00000000002a] 28023 1726853626.02501: sending task result for task 02083763-bbaf-fdb6-dad7-00000000002a 28023 1726853626.02600: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000002a 28023 1726853626.02603: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28023 1726853626.02661: no more pending results, returning what we have 28023 1726853626.02665: results queue empty 28023 1726853626.02665: checking for 
any_errors_fatal 28023 1726853626.02675: done checking for any_errors_fatal 28023 1726853626.02675: checking for max_fail_percentage 28023 1726853626.02678: done checking for max_fail_percentage 28023 1726853626.02679: checking to see if all hosts have failed and the running result is not ok 28023 1726853626.02680: done checking to see if all hosts have failed 28023 1726853626.02681: getting the remaining hosts for this loop 28023 1726853626.02682: done getting the remaining hosts for this loop 28023 1726853626.02686: getting the next task for host managed_node3 28023 1726853626.02691: done getting next task for host managed_node3 28023 1726853626.02695: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28023 1726853626.02698: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853626.02711: getting variables 28023 1726853626.02713: in VariableManager get_vars() 28023 1726853626.02748: Calling all_inventory to load vars for managed_node3 28023 1726853626.02750: Calling groups_inventory to load vars for managed_node3 28023 1726853626.02752: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853626.02760: Calling all_plugins_play to load vars for managed_node3 28023 1726853626.02763: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853626.02765: Calling groups_plugins_play to load vars for managed_node3 28023 1726853626.03531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853626.04645: done with get_vars() 28023 1726853626.04668: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:33:46 -0400 (0:00:00.033) 0:00:18.131 ****** 28023 1726853626.04746: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 28023 1726853626.04747: Creating lock for fedora.linux_system_roles.network_connections 28023 1726853626.05113: worker is 1 (out of 1 available) 28023 1726853626.05125: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 28023 1726853626.05137: done queuing things up, now waiting for results queue to drain 28023 1726853626.05138: waiting for pending results... 
28023 1726853626.05497: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28023 1726853626.05585: in run() - task 02083763-bbaf-fdb6-dad7-00000000002b 28023 1726853626.05590: variable 'ansible_search_path' from source: unknown 28023 1726853626.05592: variable 'ansible_search_path' from source: unknown 28023 1726853626.05614: calling self._execute() 28023 1726853626.05720: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853626.05730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853626.05775: variable 'omit' from source: magic vars 28023 1726853626.06108: variable 'ansible_distribution_major_version' from source: facts 28023 1726853626.06123: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853626.06132: variable 'omit' from source: magic vars 28023 1726853626.06193: variable 'omit' from source: magic vars 28023 1726853626.06348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853626.09124: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853626.09202: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853626.09205: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853626.09238: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853626.09300: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853626.09477: variable 'network_provider' from source: set_fact 28023 1726853626.09516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853626.09914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853626.09951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853626.10000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853626.10021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853626.10112: variable 'omit' from source: magic vars 28023 1726853626.10240: variable 'omit' from source: magic vars 28023 1726853626.10434: variable 'network_connections' from source: task vars 28023 1726853626.10455: variable 'interface0' from source: play vars 28023 1726853626.10596: variable 'interface0' from source: play vars 28023 1726853626.10667: variable 'interface0' from source: play vars 28023 1726853626.10736: variable 'interface0' from source: play vars 28023 1726853626.10754: variable 'interface1' from source: play vars 28023 1726853626.10933: variable 'interface1' from source: play vars 28023 1726853626.10936: variable 'interface1' from source: play vars 28023 1726853626.10937: variable 'interface1' from source: play vars 28023 1726853626.11104: variable 'omit' from source: magic vars 28023 1726853626.11119: variable '__lsr_ansible_managed' from source: task vars 28023 1726853626.11188: variable '__lsr_ansible_managed' from source: task vars 
28023 1726853626.11464: Loaded config def from plugin (lookup/template) 28023 1726853626.11478: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28023 1726853626.11505: File lookup term: get_ansible_managed.j2 28023 1726853626.11512: variable 'ansible_search_path' from source: unknown 28023 1726853626.11519: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28023 1726853626.11533: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28023 1726853626.11552: variable 'ansible_search_path' from source: unknown 28023 1726853626.17592: variable 'ansible_managed' from source: unknown 28023 1726853626.17740: variable 'omit' from source: magic vars 28023 1726853626.17776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853626.17814: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853626.17842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853626.17863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853626.17880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853626.17916: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853626.17925: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853626.17977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853626.18038: Set connection var ansible_shell_type to sh 28023 1726853626.18052: Set connection var ansible_shell_executable to /bin/sh 28023 1726853626.18063: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853626.18076: Set connection var ansible_connection to ssh 28023 1726853626.18085: Set connection var ansible_pipelining to False 28023 1726853626.18093: Set connection var ansible_timeout to 10 28023 1726853626.18123: variable 'ansible_shell_executable' from source: unknown 28023 1726853626.18129: variable 'ansible_connection' from source: unknown 28023 1726853626.18134: variable 'ansible_module_compression' from source: unknown 28023 1726853626.18178: variable 'ansible_shell_type' from source: unknown 28023 1726853626.18181: variable 'ansible_shell_executable' from source: unknown 28023 1726853626.18182: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853626.18184: variable 'ansible_pipelining' from source: unknown 28023 1726853626.18186: variable 'ansible_timeout' from source: unknown 28023 1726853626.18194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 
1726853626.18676: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853626.18680: variable 'omit' from source: magic vars 28023 1726853626.18684: starting attempt loop 28023 1726853626.18687: running the handler 28023 1726853626.18689: _low_level_execute_command(): starting 28023 1726853626.18691: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853626.19797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853626.19983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853626.19996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853626.20123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853626.21793: stdout chunk (state=3): >>>/root 
<<< 28023 1726853626.22074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853626.22079: stdout chunk (state=3): >>><<< 28023 1726853626.22082: stderr chunk (state=3): >>><<< 28023 1726853626.22141: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853626.22220: _low_level_execute_command(): starting 28023 1726853626.22612: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448 `" && echo ansible-tmp-1726853626.2219932-28914-116232850429448="` echo /root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448 `" ) && sleep 0' 28023 1726853626.23773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 28023 1726853626.23800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853626.23824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853626.23847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853626.23877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853626.24038: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853626.24042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853626.24093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853626.24164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853626.26213: stdout chunk (state=3): >>>ansible-tmp-1726853626.2219932-28914-116232850429448=/root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448 <<< 28023 1726853626.26682: stdout chunk (state=3): >>><<< 28023 1726853626.26686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853626.26691: stderr chunk (state=3): >>><<< 28023 1726853626.26713: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853626.2219932-28914-116232850429448=/root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853626.26906: variable 'ansible_module_compression' from source: unknown 28023 1726853626.27126: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 28023 1726853626.27307: ANSIBALLZ: Acquiring lock 28023 1726853626.27354: ANSIBALLZ: Lock acquired: 139729396045568 28023 1726853626.27408: ANSIBALLZ: Creating module 28023 1726853626.74732: ANSIBALLZ: Writing module into payload 28023 1726853626.75298: ANSIBALLZ: Writing module 28023 1726853626.75321: ANSIBALLZ: Renaming module 28023 1726853626.75327: ANSIBALLZ: Done creating module 28023 1726853626.75577: variable 'ansible_facts' from source: unknown 28023 1726853626.75778: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448/AnsiballZ_network_connections.py 28023 1726853626.75978: Sending initial data 28023 1726853626.75981: Sent initial data (168 bytes) 28023 1726853626.77379: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853626.77488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853626.77534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853626.77655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853626.77676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853626.77894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853626.79593: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853626.79651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853626.79715: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmplu1hbi34 /root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448/AnsiballZ_network_connections.py <<< 28023 1726853626.79719: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448/AnsiballZ_network_connections.py" <<< 28023 1726853626.79846: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmplu1hbi34" to remote "/root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448/AnsiballZ_network_connections.py" <<< 28023 1726853626.82501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853626.82524: stderr chunk (state=3): >>><<< 28023 1726853626.82527: stdout chunk (state=3): >>><<< 28023 1726853626.82550: done transferring module to remote 28023 1726853626.82564: _low_level_execute_command(): starting 28023 1726853626.82569: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448/ 
/root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448/AnsiballZ_network_connections.py && sleep 0' 28023 1726853626.83615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853626.83787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853626.83798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853626.83813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853626.83824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853626.83831: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853626.83842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853626.83857: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853626.83867: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853626.83876: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853626.83885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853626.83894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853626.83906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853626.83917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853626.83924: stderr chunk (state=3): >>>debug2: match found <<< 28023 1726853626.83926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853626.84031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853626.84276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853626.84358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853626.86349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853626.86353: stdout chunk (state=3): >>><<< 28023 1726853626.86358: stderr chunk (state=3): >>><<< 28023 1726853626.86382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853626.86385: _low_level_execute_command(): starting 28023 1726853626.86390: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 
/root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448/AnsiballZ_network_connections.py && sleep 0' 28023 1726853626.87574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853626.87580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853626.87592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853626.87606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853626.87620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853626.87626: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853626.87634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853626.87648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853626.87656: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853626.87666: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853626.87676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853626.87685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853626.87697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853626.87704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853626.87711: stderr chunk (state=3): >>>debug2: match found <<< 28023 1726853626.87721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853626.88003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853626.88103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853627.51930: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": 
"2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28023 1726853627.54143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853627.54147: stderr chunk (state=3): >>><<< 28023 1726853627.54149: stdout chunk (state=3): >>><<< 28023 1726853627.54252: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": 
"198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853627.54304: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.3/24', '2001:db8::2/32'], 'route': [{'network': '198.51.10.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4}, {'network': '2001:db6::4', 'prefix': 128, 'gateway': '2001:db8::1', 'metric': 2}]}}, {'name': 'ethtest1', 'interface_name': 'ethtest1', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.6/24', '2001:db8::4/32'], 'route': [{'network': '198.51.12.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853627.54312: _low_level_execute_command(): starting 28023 1726853627.54317: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853626.2219932-28914-116232850429448/ > /dev/null 2>&1 && sleep 0' 28023 
1726853627.55820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853627.55828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853627.55922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853627.55925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853627.56130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853627.56134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853627.56136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853627.56277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853627.56435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853627.58493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853627.58676: stderr chunk (state=3): >>><<< 28023 1726853627.58680: stdout chunk (state=3): >>><<< 28023 1726853627.58753: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853627.58757: handler run complete 28023 1726853627.58879: attempt loop complete, returning result 28023 1726853627.58887: _execute() done 28023 1726853627.58894: dumping result to json 28023 1726853627.59279: done dumping result, returning 28023 1726853627.59282: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-fdb6-dad7-00000000002b] 28023 1726853627.59285: sending task result for task 02083763-bbaf-fdb6-dad7-00000000002b 28023 1726853627.59379: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000002b 28023 1726853627.59382: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ 
"198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43 [006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34 [007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43 (not-active) [008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34 (not-active) 28023 1726853627.59569: no more pending results, returning what we have 28023 1726853627.59935: results queue empty 28023 1726853627.59937: checking for any_errors_fatal 28023 1726853627.59947: done checking for any_errors_fatal 28023 1726853627.59948: checking for max_fail_percentage 28023 1726853627.59949: done checking for max_fail_percentage 28023 1726853627.59950: checking to see if all hosts have failed and the running result is not ok 28023 1726853627.59951: done checking to see if all hosts have failed 28023 1726853627.59952: getting the remaining hosts for this loop 28023 1726853627.59953: done getting the remaining hosts for this loop 28023 1726853627.59960: getting the next task for host managed_node3 28023 1726853627.59966: done getting next task for host managed_node3 28023 1726853627.59969: ^ task is: TASK: 
fedora.linux_system_roles.network : Configure networking state 28023 1726853627.59975: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853627.59994: getting variables 28023 1726853627.59996: in VariableManager get_vars() 28023 1726853627.60266: Calling all_inventory to load vars for managed_node3 28023 1726853627.60269: Calling groups_inventory to load vars for managed_node3 28023 1726853627.60274: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853627.60285: Calling all_plugins_play to load vars for managed_node3 28023 1726853627.60288: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853627.60290: Calling groups_plugins_play to load vars for managed_node3 28023 1726853627.63628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853627.67584: done with get_vars() 28023 1726853627.67614: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:33:47 -0400 (0:00:01.630) 0:00:19.762 ****** 28023 1726853627.67821: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 28023 1726853627.67823: Creating lock for fedora.linux_system_roles.network_state 28023 1726853627.69014: worker is 1 (out of 1 available) 28023 
1726853627.69027: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 28023 1726853627.69380: done queuing things up, now waiting for results queue to drain 28023 1726853627.69382: waiting for pending results... 28023 1726853627.69795: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 28023 1726853627.69897: in run() - task 02083763-bbaf-fdb6-dad7-00000000002c 28023 1726853627.69902: variable 'ansible_search_path' from source: unknown 28023 1726853627.69909: variable 'ansible_search_path' from source: unknown 28023 1726853627.69947: calling self._execute() 28023 1726853627.70160: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853627.70164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853627.70239: variable 'omit' from source: magic vars 28023 1726853627.71076: variable 'ansible_distribution_major_version' from source: facts 28023 1726853627.71080: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853627.71401: variable 'network_state' from source: role '' defaults 28023 1726853627.71414: Evaluated conditional (network_state != {}): False 28023 1726853627.71418: when evaluation is False, skipping this task 28023 1726853627.71429: _execute() done 28023 1726853627.71432: dumping result to json 28023 1726853627.71435: done dumping result, returning 28023 1726853627.71438: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-fdb6-dad7-00000000002c] 28023 1726853627.71440: sending task result for task 02083763-bbaf-fdb6-dad7-00000000002c skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853627.71811: no more pending results, returning what we have 28023 1726853627.71816: results queue empty 28023 
1726853627.71817: checking for any_errors_fatal 28023 1726853627.71835: done checking for any_errors_fatal 28023 1726853627.71836: checking for max_fail_percentage 28023 1726853627.71839: done checking for max_fail_percentage 28023 1726853627.71840: checking to see if all hosts have failed and the running result is not ok 28023 1726853627.71841: done checking to see if all hosts have failed 28023 1726853627.71842: getting the remaining hosts for this loop 28023 1726853627.71844: done getting the remaining hosts for this loop 28023 1726853627.71848: getting the next task for host managed_node3 28023 1726853627.71860: done getting next task for host managed_node3 28023 1726853627.71864: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28023 1726853627.71868: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853627.71890: getting variables 28023 1726853627.71892: in VariableManager get_vars() 28023 1726853627.71939: Calling all_inventory to load vars for managed_node3 28023 1726853627.71942: Calling groups_inventory to load vars for managed_node3 28023 1726853627.71945: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853627.71962: Calling all_plugins_play to load vars for managed_node3 28023 1726853627.71966: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853627.71969: Calling groups_plugins_play to load vars for managed_node3 28023 1726853627.72948: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000002c 28023 1726853627.72953: WORKER PROCESS EXITING 28023 1726853627.75260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853627.78747: done with get_vars() 28023 1726853627.78782: done getting variables 28023 1726853627.78841: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:33:47 -0400 (0:00:00.111) 0:00:19.873 ****** 28023 1726853627.78993: entering _queue_task() for managed_node3/debug 28023 1726853627.79840: worker is 1 (out of 1 available) 28023 1726853627.79852: exiting _queue_task() for managed_node3/debug 28023 1726853627.79868: done queuing things up, now waiting for results queue to drain 28023 1726853627.79869: waiting for pending results... 
28023 1726853627.80266: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28023 1726853627.80622: in run() - task 02083763-bbaf-fdb6-dad7-00000000002d 28023 1726853627.80639: variable 'ansible_search_path' from source: unknown 28023 1726853627.80645: variable 'ansible_search_path' from source: unknown 28023 1726853627.80684: calling self._execute() 28023 1726853627.80951: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853627.80957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853627.80969: variable 'omit' from source: magic vars 28023 1726853627.81923: variable 'ansible_distribution_major_version' from source: facts 28023 1726853627.81925: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853627.81927: variable 'omit' from source: magic vars 28023 1726853627.82157: variable 'omit' from source: magic vars 28023 1726853627.82160: variable 'omit' from source: magic vars 28023 1726853627.82163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853627.82393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853627.82415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853627.82433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853627.82444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853627.82478: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853627.82482: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853627.82494: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 28023 1726853627.82588: Set connection var ansible_shell_type to sh 28023 1726853627.82714: Set connection var ansible_shell_executable to /bin/sh 28023 1726853627.82720: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853627.82726: Set connection var ansible_connection to ssh 28023 1726853627.82732: Set connection var ansible_pipelining to False 28023 1726853627.82738: Set connection var ansible_timeout to 10 28023 1726853627.82773: variable 'ansible_shell_executable' from source: unknown 28023 1726853627.82776: variable 'ansible_connection' from source: unknown 28023 1726853627.82779: variable 'ansible_module_compression' from source: unknown 28023 1726853627.82881: variable 'ansible_shell_type' from source: unknown 28023 1726853627.82885: variable 'ansible_shell_executable' from source: unknown 28023 1726853627.82888: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853627.82890: variable 'ansible_pipelining' from source: unknown 28023 1726853627.82895: variable 'ansible_timeout' from source: unknown 28023 1726853627.82899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853627.83154: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853627.83167: variable 'omit' from source: magic vars 28023 1726853627.83274: starting attempt loop 28023 1726853627.83277: running the handler 28023 1726853627.83504: variable '__network_connections_result' from source: set_fact 28023 1726853627.83641: handler run complete 28023 1726853627.83700: attempt loop complete, returning result 28023 1726853627.83703: _execute() done 28023 1726853627.83705: dumping result to json 28023 1726853627.83708: 
done dumping result, returning 28023 1726853627.83728: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-fdb6-dad7-00000000002d] 28023 1726853627.83731: sending task result for task 02083763-bbaf-fdb6-dad7-00000000002d ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43 (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34 (not-active)" ] } 28023 1726853627.84260: no more pending results, returning what we have 28023 1726853627.84264: results queue empty 28023 1726853627.84265: checking for any_errors_fatal 28023 1726853627.84270: done checking for any_errors_fatal 28023 1726853627.84273: checking for max_fail_percentage 28023 1726853627.84275: done checking for max_fail_percentage 28023 1726853627.84276: checking to see if all hosts have failed and the running result is not ok 28023 1726853627.84277: done checking to see if all hosts have failed 28023 1726853627.84278: getting the remaining hosts for this loop 28023 1726853627.84279: done getting the remaining hosts for this loop 28023 1726853627.84283: getting the next task for host managed_node3 28023 1726853627.84288: done getting next task for host managed_node3 28023 1726853627.84291: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28023 1726853627.84294: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853627.84302: getting variables 28023 1726853627.84303: in VariableManager get_vars() 28023 1726853627.84338: Calling all_inventory to load vars for managed_node3 28023 1726853627.84340: Calling groups_inventory to load vars for managed_node3 28023 1726853627.84342: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853627.84350: Calling all_plugins_play to load vars for managed_node3 28023 1726853627.84352: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853627.84355: Calling groups_plugins_play to load vars for managed_node3 28023 1726853627.85028: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000002d 28023 1726853627.85032: WORKER PROCESS EXITING 28023 1726853627.87548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853627.90960: done with get_vars() 28023 1726853627.90990: done getting variables 28023 1726853627.91050: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:33:47 -0400 (0:00:00.122) 0:00:19.995 ****** 28023 
1726853627.91210: entering _queue_task() for managed_node3/debug 28023 1726853627.92225: worker is 1 (out of 1 available) 28023 1726853627.92238: exiting _queue_task() for managed_node3/debug 28023 1726853627.92250: done queuing things up, now waiting for results queue to drain 28023 1726853627.92252: waiting for pending results... 28023 1726853627.92692: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28023 1726853627.92966: in run() - task 02083763-bbaf-fdb6-dad7-00000000002e 28023 1726853627.93048: variable 'ansible_search_path' from source: unknown 28023 1726853627.93059: variable 'ansible_search_path' from source: unknown 28023 1726853627.93133: calling self._execute() 28023 1726853627.93467: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853627.93472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853627.93474: variable 'omit' from source: magic vars 28023 1726853627.94340: variable 'ansible_distribution_major_version' from source: facts 28023 1726853627.94343: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853627.94346: variable 'omit' from source: magic vars 28023 1726853627.94491: variable 'omit' from source: magic vars 28023 1726853627.94600: variable 'omit' from source: magic vars 28023 1726853627.94720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853627.94777: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853627.94851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853627.94903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853627.94920: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853627.95044: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853627.95047: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853627.95050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853627.95245: Set connection var ansible_shell_type to sh 28023 1726853627.95372: Set connection var ansible_shell_executable to /bin/sh 28023 1726853627.95375: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853627.95378: Set connection var ansible_connection to ssh 28023 1726853627.95380: Set connection var ansible_pipelining to False 28023 1726853627.95382: Set connection var ansible_timeout to 10 28023 1726853627.95477: variable 'ansible_shell_executable' from source: unknown 28023 1726853627.95481: variable 'ansible_connection' from source: unknown 28023 1726853627.95484: variable 'ansible_module_compression' from source: unknown 28023 1726853627.95486: variable 'ansible_shell_type' from source: unknown 28023 1726853627.95488: variable 'ansible_shell_executable' from source: unknown 28023 1726853627.95490: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853627.95492: variable 'ansible_pipelining' from source: unknown 28023 1726853627.95494: variable 'ansible_timeout' from source: unknown 28023 1726853627.95496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853627.95979: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853627.95983: variable 'omit' from source: magic vars 28023 1726853627.95985: starting attempt 
loop 28023 1726853627.95987: running the handler 28023 1726853627.96015: variable '__network_connections_result' from source: set_fact 28023 1726853627.96304: variable '__network_connections_result' from source: set_fact 28023 1726853627.96688: handler run complete 28023 1726853627.96892: attempt loop complete, returning result 28023 1726853627.96895: _execute() done 28023 1726853627.96897: dumping result to json 28023 1726853627.96899: done dumping result, returning 28023 1726853627.96901: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-fdb6-dad7-00000000002e] 28023 1726853627.96903: sending task result for task 02083763-bbaf-fdb6-dad7-00000000002e 28023 1726853627.97322: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000002e 28023 1726853627.97325: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43\n[006] #1, state:up 
persistent_state:present, 'ethtest1': add connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34 (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, dac9b760-b2ba-4f99-bc3f-cd7e791a7d43 (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34 (not-active)" ] } } 28023 1726853627.97462: no more pending results, returning what we have 28023 1726853627.97467: results queue empty 28023 1726853627.97468: checking for any_errors_fatal 28023 1726853627.97479: done checking for any_errors_fatal 28023 1726853627.97480: checking for max_fail_percentage 28023 1726853627.97482: done checking for max_fail_percentage 28023 1726853627.97483: checking to see if all hosts have failed and the running result is not ok 28023 1726853627.97484: done checking to see if all hosts have failed 28023 1726853627.97485: getting the remaining hosts for this loop 28023 1726853627.97487: done getting the remaining hosts for this loop 28023 1726853627.97777: getting the next task for host managed_node3 28023 1726853627.97783: done getting next task for host managed_node3 28023 1726853627.97787: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28023 1726853627.97791: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853627.97802: getting variables 28023 1726853627.97803: in VariableManager get_vars() 28023 1726853627.97848: Calling all_inventory to load vars for managed_node3 28023 1726853627.97851: Calling groups_inventory to load vars for managed_node3 28023 1726853627.97853: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853627.97865: Calling all_plugins_play to load vars for managed_node3 28023 1726853627.97868: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853627.97873: Calling groups_plugins_play to load vars for managed_node3 28023 1726853628.00853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853628.04904: done with get_vars() 28023 1726853628.04934: done getting variables 28023 1726853628.05124: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:33:48 -0400 (0:00:00.139) 0:00:20.135 ****** 28023 1726853628.05162: entering _queue_task() for managed_node3/debug 28023 1726853628.06143: worker is 1 (out of 1 available) 28023 1726853628.06152: 
exiting _queue_task() for managed_node3/debug 28023 1726853628.06167: done queuing things up, now waiting for results queue to drain 28023 1726853628.06169: waiting for pending results... 28023 1726853628.06312: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28023 1726853628.06443: in run() - task 02083763-bbaf-fdb6-dad7-00000000002f 28023 1726853628.06472: variable 'ansible_search_path' from source: unknown 28023 1726853628.06481: variable 'ansible_search_path' from source: unknown 28023 1726853628.06533: calling self._execute() 28023 1726853628.06644: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853628.06658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853628.06674: variable 'omit' from source: magic vars 28023 1726853628.07068: variable 'ansible_distribution_major_version' from source: facts 28023 1726853628.07087: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853628.07215: variable 'network_state' from source: role '' defaults 28023 1726853628.07229: Evaluated conditional (network_state != {}): False 28023 1726853628.07236: when evaluation is False, skipping this task 28023 1726853628.07256: _execute() done 28023 1726853628.07376: dumping result to json 28023 1726853628.07387: done dumping result, returning 28023 1726853628.07391: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-fdb6-dad7-00000000002f] 28023 1726853628.07393: sending task result for task 02083763-bbaf-fdb6-dad7-00000000002f 28023 1726853628.07464: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000002f 28023 1726853628.07468: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 28023 1726853628.07535: no more pending results, returning what we have 28023 
1726853628.07539: results queue empty 28023 1726853628.07540: checking for any_errors_fatal 28023 1726853628.07555: done checking for any_errors_fatal 28023 1726853628.07556: checking for max_fail_percentage 28023 1726853628.07561: done checking for max_fail_percentage 28023 1726853628.07562: checking to see if all hosts have failed and the running result is not ok 28023 1726853628.07563: done checking to see if all hosts have failed 28023 1726853628.07564: getting the remaining hosts for this loop 28023 1726853628.07566: done getting the remaining hosts for this loop 28023 1726853628.07570: getting the next task for host managed_node3 28023 1726853628.07578: done getting next task for host managed_node3 28023 1726853628.07583: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28023 1726853628.07586: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853628.07601: getting variables 28023 1726853628.07603: in VariableManager get_vars() 28023 1726853628.07646: Calling all_inventory to load vars for managed_node3 28023 1726853628.07649: Calling groups_inventory to load vars for managed_node3 28023 1726853628.07652: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853628.07669: Calling all_plugins_play to load vars for managed_node3 28023 1726853628.07688: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853628.07692: Calling groups_plugins_play to load vars for managed_node3 28023 1726853628.11366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853628.15193: done with get_vars() 28023 1726853628.15227: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:33:48 -0400 (0:00:00.102) 0:00:20.238 ****** 28023 1726853628.15447: entering _queue_task() for managed_node3/ping 28023 1726853628.15449: Creating lock for ping 28023 1726853628.16498: worker is 1 (out of 1 available) 28023 1726853628.16510: exiting _queue_task() for managed_node3/ping 28023 1726853628.16523: done queuing things up, now waiting for results queue to drain 28023 1726853628.16524: waiting for pending results... 
28023 1726853628.17245: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 28023 1726853628.17864: in run() - task 02083763-bbaf-fdb6-dad7-000000000030 28023 1726853628.17869: variable 'ansible_search_path' from source: unknown 28023 1726853628.17873: variable 'ansible_search_path' from source: unknown 28023 1726853628.17973: calling self._execute() 28023 1726853628.18191: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853628.18281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853628.18285: variable 'omit' from source: magic vars 28023 1726853628.18955: variable 'ansible_distribution_major_version' from source: facts 28023 1726853628.18969: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853628.19079: variable 'omit' from source: magic vars 28023 1726853628.19104: variable 'omit' from source: magic vars 28023 1726853628.19140: variable 'omit' from source: magic vars 28023 1726853628.19212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853628.19295: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853628.19299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853628.19302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853628.19304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853628.19329: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853628.19332: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853628.19334: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 28023 1726853628.19549: Set connection var ansible_shell_type to sh 28023 1726853628.19559: Set connection var ansible_shell_executable to /bin/sh 28023 1726853628.19562: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853628.19567: Set connection var ansible_connection to ssh 28023 1726853628.19597: Set connection var ansible_pipelining to False 28023 1726853628.19604: Set connection var ansible_timeout to 10 28023 1726853628.19630: variable 'ansible_shell_executable' from source: unknown 28023 1726853628.19634: variable 'ansible_connection' from source: unknown 28023 1726853628.19637: variable 'ansible_module_compression' from source: unknown 28023 1726853628.19639: variable 'ansible_shell_type' from source: unknown 28023 1726853628.19642: variable 'ansible_shell_executable' from source: unknown 28023 1726853628.19644: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853628.19646: variable 'ansible_pipelining' from source: unknown 28023 1726853628.19648: variable 'ansible_timeout' from source: unknown 28023 1726853628.19692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853628.19895: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853628.19909: variable 'omit' from source: magic vars 28023 1726853628.19914: starting attempt loop 28023 1726853628.19946: running the handler 28023 1726853628.19950: _low_level_execute_command(): starting 28023 1726853628.19953: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853628.21075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853628.21165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853628.21229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.22920: stdout chunk (state=3): >>>/root <<< 28023 1726853628.23176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853628.23180: stdout chunk (state=3): >>><<< 28023 1726853628.23183: stderr chunk (state=3): >>><<< 28023 1726853628.23187: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853628.23190: _low_level_execute_command(): starting 28023 1726853628.23193: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971 `" && echo ansible-tmp-1726853628.2309558-28987-111894162288971="` echo /root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971 `" ) && sleep 0' 28023 1726853628.23697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853628.23706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853628.23716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853628.23732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853628.23776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853628.23779: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853628.23782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.23855: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853628.23883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853628.23990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.26084: stdout chunk (state=3): >>>ansible-tmp-1726853628.2309558-28987-111894162288971=/root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971 <<< 28023 1726853628.26175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853628.26179: stdout chunk (state=3): >>><<< 28023 1726853628.26181: stderr chunk (state=3): >>><<< 28023 1726853628.26199: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853628.2309558-28987-111894162288971=/root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853628.26316: variable 'ansible_module_compression' from source: unknown 28023 1726853628.26383: ANSIBALLZ: Using lock for ping 28023 1726853628.26482: ANSIBALLZ: Acquiring lock 28023 1726853628.26486: ANSIBALLZ: Lock acquired: 139729390906320 28023 1726853628.26488: ANSIBALLZ: Creating module 28023 1726853628.37184: ANSIBALLZ: Writing module into payload 28023 1726853628.37224: ANSIBALLZ: Writing module 28023 1726853628.37240: ANSIBALLZ: Renaming module 28023 1726853628.37247: ANSIBALLZ: Done creating module 28023 1726853628.37266: variable 'ansible_facts' from source: unknown 28023 1726853628.37312: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971/AnsiballZ_ping.py 28023 1726853628.37410: Sending initial data 28023 1726853628.37414: Sent initial data (153 bytes) 28023 1726853628.37828: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853628.37865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853628.37868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853628.37870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.37875: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853628.37877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853628.37879: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.37935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853628.37938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853628.37939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853628.38001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.39739: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853628.39881: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853628.39939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpsqkfaqzv /root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971/AnsiballZ_ping.py <<< 28023 1726853628.39950: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971/AnsiballZ_ping.py" <<< 28023 1726853628.40043: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 28023 1726853628.40103: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpsqkfaqzv" to remote "/root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971/AnsiballZ_ping.py" <<< 28023 1726853628.40754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853628.40816: stderr chunk (state=3): >>><<< 28023 1726853628.40828: stdout chunk (state=3): >>><<< 28023 1726853628.40970: done transferring module to remote 28023 1726853628.40975: _low_level_execute_command(): starting 28023 1726853628.40977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971/ /root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971/AnsiballZ_ping.py && sleep 0' 28023 1726853628.41467: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.41500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853628.41512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853628.41583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.43467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853628.43470: stderr chunk (state=3): >>><<< 28023 1726853628.43478: stdout chunk (state=3): >>><<< 28023 1726853628.43501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853628.43504: _low_level_execute_command(): starting 28023 1726853628.43515: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971/AnsiballZ_ping.py && sleep 0' 28023 1726853628.44194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853628.44212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853628.44222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853628.44237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853628.44263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853628.44266: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853628.44268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.44288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853628.44296: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853628.44319: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853628.44322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853628.44400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853628.44404: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853628.44406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853628.44408: stderr chunk (state=3): >>>debug2: match found <<< 28023 1726853628.44412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.44466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853628.44470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853628.44558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.59866: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28023 1726853628.61236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853628.61286: stderr chunk (state=3): >>><<< 28023 1726853628.61290: stdout chunk (state=3): >>><<< 28023 1726853628.61304: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853628.61324: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853628.61332: _low_level_execute_command(): starting 28023 1726853628.61336: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853628.2309558-28987-111894162288971/ > /dev/null 2>&1 && sleep 0' 28023 1726853628.61862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853628.61866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.61880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853628.61892: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.61939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853628.61950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853628.62016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.63929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853628.63966: stderr chunk (state=3): >>><<< 28023 1726853628.63970: stdout chunk (state=3): >>><<< 28023 1726853628.63986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853628.63992: handler run complete 
28023 1726853628.64006: attempt loop complete, returning result 28023 1726853628.64009: _execute() done 28023 1726853628.64011: dumping result to json 28023 1726853628.64015: done dumping result, returning 28023 1726853628.64027: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-fdb6-dad7-000000000030] 28023 1726853628.64030: sending task result for task 02083763-bbaf-fdb6-dad7-000000000030 28023 1726853628.64140: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000030 28023 1726853628.64142: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 28023 1726853628.64202: no more pending results, returning what we have 28023 1726853628.64205: results queue empty 28023 1726853628.64205: checking for any_errors_fatal 28023 1726853628.64213: done checking for any_errors_fatal 28023 1726853628.64214: checking for max_fail_percentage 28023 1726853628.64216: done checking for max_fail_percentage 28023 1726853628.64216: checking to see if all hosts have failed and the running result is not ok 28023 1726853628.64217: done checking to see if all hosts have failed 28023 1726853628.64218: getting the remaining hosts for this loop 28023 1726853628.64220: done getting the remaining hosts for this loop 28023 1726853628.64225: getting the next task for host managed_node3 28023 1726853628.64235: done getting next task for host managed_node3 28023 1726853628.64237: ^ task is: TASK: meta (role_complete) 28023 1726853628.64240: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853628.64253: getting variables 28023 1726853628.64254: in VariableManager get_vars() 28023 1726853628.64342: Calling all_inventory to load vars for managed_node3 28023 1726853628.64345: Calling groups_inventory to load vars for managed_node3 28023 1726853628.64362: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853628.64379: Calling all_plugins_play to load vars for managed_node3 28023 1726853628.64382: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853628.64386: Calling groups_plugins_play to load vars for managed_node3 28023 1726853628.65565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853628.66442: done with get_vars() 28023 1726853628.66463: done getting variables 28023 1726853628.66523: done queuing things up, now waiting for results queue to drain 28023 1726853628.66524: results queue empty 28023 1726853628.66525: checking for any_errors_fatal 28023 1726853628.66527: done checking for any_errors_fatal 28023 1726853628.66527: checking for max_fail_percentage 28023 1726853628.66528: done checking for max_fail_percentage 28023 1726853628.66529: checking to see if all hosts have failed and the running result is not ok 28023 1726853628.66529: done checking to see if all hosts have failed 28023 1726853628.66530: getting the remaining hosts for this loop 28023 1726853628.66530: done getting the remaining hosts for this loop 28023 1726853628.66532: getting the next task for host managed_node3 28023 1726853628.66535: done getting next task for host managed_node3 28023 1726853628.66537: ^ task is: TASK: Get the IPv4 routes from the route table main 28023 1726853628.66537: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853628.66539: getting variables 28023 1726853628.66540: in VariableManager get_vars() 28023 1726853628.66549: Calling all_inventory to load vars for managed_node3 28023 1726853628.66550: Calling groups_inventory to load vars for managed_node3 28023 1726853628.66552: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853628.66555: Calling all_plugins_play to load vars for managed_node3 28023 1726853628.66557: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853628.66559: Calling groups_plugins_play to load vars for managed_node3 28023 1726853628.67447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853628.68516: done with get_vars() 28023 1726853628.68532: done getting variables 28023 1726853628.68567: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the IPv4 routes from the route table main] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:73 Friday 20 September 2024 13:33:48 -0400 (0:00:00.531) 0:00:20.769 ****** 28023 1726853628.68599: entering _queue_task() for managed_node3/command 28023 1726853628.68920: worker is 1 (out of 1 available) 28023 1726853628.68933: exiting _queue_task() for managed_node3/command 28023 1726853628.68947: done queuing things up, now waiting for results queue to drain 28023 1726853628.68949: waiting for pending results... 
28023 1726853628.69222: running TaskExecutor() for managed_node3/TASK: Get the IPv4 routes from the route table main 28023 1726853628.69305: in run() - task 02083763-bbaf-fdb6-dad7-000000000060 28023 1726853628.69317: variable 'ansible_search_path' from source: unknown 28023 1726853628.69351: calling self._execute() 28023 1726853628.69432: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853628.69437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853628.69445: variable 'omit' from source: magic vars 28023 1726853628.69787: variable 'ansible_distribution_major_version' from source: facts 28023 1726853628.69798: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853628.69818: variable 'omit' from source: magic vars 28023 1726853628.69841: variable 'omit' from source: magic vars 28023 1726853628.69868: variable 'omit' from source: magic vars 28023 1726853628.69907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853628.69935: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853628.69978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853628.69984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853628.69997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853628.70021: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853628.70024: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853628.70026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853628.70113: Set connection var ansible_shell_type to sh 28023 
1726853628.70117: Set connection var ansible_shell_executable to /bin/sh 28023 1726853628.70126: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853628.70129: Set connection var ansible_connection to ssh 28023 1726853628.70132: Set connection var ansible_pipelining to False 28023 1726853628.70139: Set connection var ansible_timeout to 10 28023 1726853628.70157: variable 'ansible_shell_executable' from source: unknown 28023 1726853628.70162: variable 'ansible_connection' from source: unknown 28023 1726853628.70165: variable 'ansible_module_compression' from source: unknown 28023 1726853628.70168: variable 'ansible_shell_type' from source: unknown 28023 1726853628.70172: variable 'ansible_shell_executable' from source: unknown 28023 1726853628.70174: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853628.70178: variable 'ansible_pipelining' from source: unknown 28023 1726853628.70181: variable 'ansible_timeout' from source: unknown 28023 1726853628.70185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853628.70308: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853628.70317: variable 'omit' from source: magic vars 28023 1726853628.70327: starting attempt loop 28023 1726853628.70330: running the handler 28023 1726853628.70339: _low_level_execute_command(): starting 28023 1726853628.70347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853628.71086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.71140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853628.71144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853628.71198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.72883: stdout chunk (state=3): >>>/root <<< 28023 1726853628.72984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853628.73014: stderr chunk (state=3): >>><<< 28023 1726853628.73018: stdout chunk (state=3): >>><<< 28023 1726853628.73038: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853628.73051: _low_level_execute_command(): starting 28023 1726853628.73059: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474 `" && echo ansible-tmp-1726853628.7303958-29008-122790012912474="` echo /root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474 `" ) && sleep 0' 28023 1726853628.73550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853628.73553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853628.73587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.73591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853628.73601: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.73642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853628.73646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853628.73656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853628.73726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.75676: stdout chunk (state=3): >>>ansible-tmp-1726853628.7303958-29008-122790012912474=/root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474 <<< 28023 1726853628.75798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853628.75801: stderr chunk (state=3): >>><<< 28023 1726853628.75804: stdout chunk (state=3): >>><<< 28023 1726853628.75821: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853628.7303958-29008-122790012912474=/root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853628.75849: variable 'ansible_module_compression' from source: unknown 28023 1726853628.75894: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853628.75935: variable 'ansible_facts' from source: unknown 28023 1726853628.76000: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474/AnsiballZ_command.py 28023 1726853628.76106: Sending initial data 28023 1726853628.76110: Sent initial data (156 bytes) 28023 1726853628.76719: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.76779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853628.76783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853628.76855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.78460: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28023 1726853628.78464: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853628.78512: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853628.78577: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpgzb7lrkx /root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474/AnsiballZ_command.py <<< 28023 1726853628.78579: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474/AnsiballZ_command.py" <<< 28023 1726853628.78636: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpgzb7lrkx" to remote "/root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474/AnsiballZ_command.py" <<< 28023 1726853628.78641: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474/AnsiballZ_command.py" <<< 28023 1726853628.79265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853628.79325: stderr chunk (state=3): >>><<< 28023 1726853628.79328: stdout chunk (state=3): >>><<< 28023 1726853628.79384: done transferring module to remote 28023 1726853628.79387: _low_level_execute_command(): starting 28023 1726853628.79396: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474/ /root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474/AnsiballZ_command.py && sleep 0' 28023 1726853628.80120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853628.80206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.82027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853628.82031: stderr chunk (state=3): >>><<< 28023 1726853628.82033: stdout chunk (state=3): >>><<< 28023 1726853628.82049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853628.82052: _low_level_execute_command(): starting 28023 1726853628.82061: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474/AnsiballZ_command.py && sleep 0' 28023 1726853628.82470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853628.82508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853628.82511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853628.82514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.82516: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853628.82518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853628.82566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853628.82570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 
1726853628.82644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853628.98420: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-20 13:33:48.978905", "end": "2024-09-20 13:33:48.982845", "delta": "0:00:00.003940", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853629.00051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853629.00078: stderr chunk (state=3): >>><<< 28023 1726853629.00081: stdout chunk (state=3): >>><<< 28023 1726853629.00098: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-20 13:33:48.978905", "end": "2024-09-20 13:33:48.982845", "delta": "0:00:00.003940", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853629.00131: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853629.00138: _low_level_execute_command(): starting 28023 1726853629.00142: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853628.7303958-29008-122790012912474/ > /dev/null 2>&1 && sleep 0' 28023 1726853629.00563: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.00573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853629.00598: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.00603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853629.00605: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.00666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853629.00669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853629.00676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.00725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.02591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853629.02612: stderr chunk (state=3): >>><<< 28023 1726853629.02615: stdout chunk (state=3): >>><<< 28023 1726853629.02627: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853629.02633: handler run complete 28023 1726853629.02652: Evaluated conditional (False): False 28023 1726853629.02661: attempt loop complete, returning result 28023 1726853629.02664: _execute() done 28023 1726853629.02666: dumping result to json 28023 1726853629.02673: done dumping result, returning 28023 1726853629.02684: done running TaskExecutor() for managed_node3/TASK: Get the IPv4 routes from the route table main [02083763-bbaf-fdb6-dad7-000000000060] 28023 1726853629.02690: sending task result for task 02083763-bbaf-fdb6-dad7-000000000060 28023 1726853629.02788: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000060 28023 1726853629.02791: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-4", "route" ], "delta": "0:00:00.003940", "end": "2024-09-20 13:33:48.982845", "rc": 0, "start": "2024-09-20 13:33:48.978905" } STDOUT: default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown 198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 
28023 1726853629.02867: no more pending results, returning what we have 28023 1726853629.02872: results queue empty 28023 1726853629.02873: checking for any_errors_fatal 28023 1726853629.02875: done checking for any_errors_fatal 28023 1726853629.02875: checking for max_fail_percentage 28023 1726853629.02877: done checking for max_fail_percentage 28023 1726853629.02878: checking to see if all hosts have failed and the running result is not ok 28023 1726853629.02879: done checking to see if all hosts have failed 28023 1726853629.02880: getting the remaining hosts for this loop 28023 1726853629.02882: done getting the remaining hosts for this loop 28023 1726853629.02886: getting the next task for host managed_node3 28023 1726853629.02891: done getting next task for host managed_node3 28023 1726853629.02893: ^ task is: TASK: Assert that the route table main contains the specified IPv4 routes 28023 1726853629.02895: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853629.02898: getting variables 28023 1726853629.02900: in VariableManager get_vars() 28023 1726853629.02941: Calling all_inventory to load vars for managed_node3 28023 1726853629.02944: Calling groups_inventory to load vars for managed_node3 28023 1726853629.02946: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853629.02959: Calling all_plugins_play to load vars for managed_node3 28023 1726853629.02962: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853629.02964: Calling groups_plugins_play to load vars for managed_node3 28023 1726853629.07055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853629.07904: done with get_vars() 28023 1726853629.07922: done getting variables 28023 1726853629.07957: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv4 routes] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:78 Friday 20 September 2024 13:33:49 -0400 (0:00:00.393) 0:00:21.163 ****** 28023 1726853629.07977: entering _queue_task() for managed_node3/assert 28023 1726853629.08248: worker is 1 (out of 1 available) 28023 1726853629.08265: exiting _queue_task() for managed_node3/assert 28023 1726853629.08280: done queuing things up, now waiting for results queue to drain 28023 1726853629.08281: waiting for pending results... 
28023 1726853629.08464: running TaskExecutor() for managed_node3/TASK: Assert that the route table main contains the specified IPv4 routes 28023 1726853629.08540: in run() - task 02083763-bbaf-fdb6-dad7-000000000061 28023 1726853629.08551: variable 'ansible_search_path' from source: unknown 28023 1726853629.08580: calling self._execute() 28023 1726853629.08661: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.08665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.08674: variable 'omit' from source: magic vars 28023 1726853629.08959: variable 'ansible_distribution_major_version' from source: facts 28023 1726853629.08968: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853629.08974: variable 'omit' from source: magic vars 28023 1726853629.08992: variable 'omit' from source: magic vars 28023 1726853629.09018: variable 'omit' from source: magic vars 28023 1726853629.09055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853629.09086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853629.09102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853629.09115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853629.09125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853629.09149: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853629.09152: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.09154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.09225: Set connection var 
ansible_shell_type to sh 28023 1726853629.09231: Set connection var ansible_shell_executable to /bin/sh 28023 1726853629.09237: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853629.09243: Set connection var ansible_connection to ssh 28023 1726853629.09248: Set connection var ansible_pipelining to False 28023 1726853629.09254: Set connection var ansible_timeout to 10 28023 1726853629.09279: variable 'ansible_shell_executable' from source: unknown 28023 1726853629.09282: variable 'ansible_connection' from source: unknown 28023 1726853629.09285: variable 'ansible_module_compression' from source: unknown 28023 1726853629.09287: variable 'ansible_shell_type' from source: unknown 28023 1726853629.09290: variable 'ansible_shell_executable' from source: unknown 28023 1726853629.09293: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.09295: variable 'ansible_pipelining' from source: unknown 28023 1726853629.09297: variable 'ansible_timeout' from source: unknown 28023 1726853629.09299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.09401: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853629.09417: variable 'omit' from source: magic vars 28023 1726853629.09420: starting attempt loop 28023 1726853629.09423: running the handler 28023 1726853629.09536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853629.09707: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853629.09759: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 
1726853629.09792: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853629.09820: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853629.09894: variable 'route_table_main_ipv4' from source: set_fact 28023 1726853629.09920: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4")): True 28023 1726853629.10015: variable 'route_table_main_ipv4' from source: set_fact 28023 1726853629.10038: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2")): True 28023 1726853629.10043: handler run complete 28023 1726853629.10055: attempt loop complete, returning result 28023 1726853629.10058: _execute() done 28023 1726853629.10064: dumping result to json 28023 1726853629.10067: done dumping result, returning 28023 1726853629.10077: done running TaskExecutor() for managed_node3/TASK: Assert that the route table main contains the specified IPv4 routes [02083763-bbaf-fdb6-dad7-000000000061] 28023 1726853629.10081: sending task result for task 02083763-bbaf-fdb6-dad7-000000000061 28023 1726853629.10164: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000061 28023 1726853629.10166: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 28023 1726853629.10218: no more pending results, returning what we have 28023 1726853629.10221: results queue empty 28023 1726853629.10222: checking for any_errors_fatal 28023 1726853629.10232: done checking for any_errors_fatal 28023 1726853629.10232: checking for max_fail_percentage 28023 1726853629.10235: done checking for max_fail_percentage 28023 1726853629.10236: checking to see if all hosts have failed and the running result is not ok 28023 1726853629.10237: done checking to see if all hosts have failed 
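The two `Evaluated conditional ... True` lines above come from Ansible's Jinja `search` test, which behaves like Python's `re.search` applied to the registered stdout. A minimal sketch of the same check (route lines copied from the earlier task result; this is an illustration, not the playbook's actual code):

```python
import re

# Relevant lines from the registered `ip -4 route` stdout above.
stdout = (
    "198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4\n"
    "198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2"
)

# The same patterns the assert task evaluates; `(proto static )?` makes
# the proto field optional, and `\s+` tolerates variable spacing.
patterns = [
    r"198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4",
    r"198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2",
]

# Ansible's `stdout is search(pattern)` passes when re.search finds a match.
results = [bool(re.search(p, stdout)) for p in patterns]
print(results)  # → [True, True]
```

Both conditionals evaluate to True, which is why the task reports "All assertions passed" without running anything on the remote host: `assert` is a local action module.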
28023 1726853629.10238: getting the remaining hosts for this loop 28023 1726853629.10239: done getting the remaining hosts for this loop 28023 1726853629.10242: getting the next task for host managed_node3 28023 1726853629.10248: done getting next task for host managed_node3 28023 1726853629.10250: ^ task is: TASK: Get the IPv6 routes from the route table main 28023 1726853629.10252: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853629.10255: getting variables 28023 1726853629.10258: in VariableManager get_vars() 28023 1726853629.10299: Calling all_inventory to load vars for managed_node3 28023 1726853629.10302: Calling groups_inventory to load vars for managed_node3 28023 1726853629.10304: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853629.10316: Calling all_plugins_play to load vars for managed_node3 28023 1726853629.10318: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853629.10321: Calling groups_plugins_play to load vars for managed_node3 28023 1726853629.11111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853629.11985: done with get_vars() 28023 1726853629.12000: done getting variables 28023 1726853629.12041: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the IPv6 routes from the route table main] *************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:89 Friday 20 September 2024 13:33:49 -0400 (0:00:00.040) 0:00:21.204 ****** 28023 1726853629.12068: entering _queue_task() for managed_node3/command 28023 1726853629.12302: worker is 1 (out of 1 available) 28023 1726853629.12314: exiting _queue_task() for managed_node3/command 28023 1726853629.12326: done queuing things up, now waiting for results queue to drain 28023 1726853629.12327: waiting for pending results... 28023 1726853629.12512: running TaskExecutor() for managed_node3/TASK: Get the IPv6 routes from the route table main 28023 1726853629.12583: in run() - task 02083763-bbaf-fdb6-dad7-000000000062 28023 1726853629.12595: variable 'ansible_search_path' from source: unknown 28023 1726853629.12624: calling self._execute() 28023 1726853629.12708: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.12712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.12721: variable 'omit' from source: magic vars 28023 1726853629.13002: variable 'ansible_distribution_major_version' from source: facts 28023 1726853629.13013: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853629.13018: variable 'omit' from source: magic vars 28023 1726853629.13036: variable 'omit' from source: magic vars 28023 1726853629.13062: variable 'omit' from source: magic vars 28023 1726853629.13095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853629.13126: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853629.13143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853629.13159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 28023 1726853629.13168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853629.13193: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853629.13197: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.13199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.13267: Set connection var ansible_shell_type to sh 28023 1726853629.13274: Set connection var ansible_shell_executable to /bin/sh 28023 1726853629.13280: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853629.13285: Set connection var ansible_connection to ssh 28023 1726853629.13291: Set connection var ansible_pipelining to False 28023 1726853629.13296: Set connection var ansible_timeout to 10 28023 1726853629.13317: variable 'ansible_shell_executable' from source: unknown 28023 1726853629.13322: variable 'ansible_connection' from source: unknown 28023 1726853629.13325: variable 'ansible_module_compression' from source: unknown 28023 1726853629.13328: variable 'ansible_shell_type' from source: unknown 28023 1726853629.13330: variable 'ansible_shell_executable' from source: unknown 28023 1726853629.13333: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.13335: variable 'ansible_pipelining' from source: unknown 28023 1726853629.13337: variable 'ansible_timeout' from source: unknown 28023 1726853629.13339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.13437: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853629.13445: 
variable 'omit' from source: magic vars 28023 1726853629.13455: starting attempt loop 28023 1726853629.13461: running the handler 28023 1726853629.13473: _low_level_execute_command(): starting 28023 1726853629.13479: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853629.13990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.13993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.13999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.14001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.14055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853629.14058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853629.14061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.14137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.15854: stdout chunk (state=3): >>>/root <<< 28023 1726853629.15952: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 28023 1726853629.15982: stderr chunk (state=3): >>><<< 28023 1726853629.15985: stdout chunk (state=3): >>><<< 28023 1726853629.16006: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853629.16017: _low_level_execute_command(): starting 28023 1726853629.16021: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932 `" && echo ansible-tmp-1726853629.1600566-29024-250211799264932="` echo /root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932 `" ) && sleep 0' 28023 1726853629.16455: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853629.16461: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853629.16464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853629.16474: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.16477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.16526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853629.16528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.16588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.18553: stdout chunk (state=3): >>>ansible-tmp-1726853629.1600566-29024-250211799264932=/root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932 <<< 28023 1726853629.18665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853629.18690: stderr chunk (state=3): >>><<< 28023 1726853629.18693: stdout chunk (state=3): >>><<< 28023 1726853629.18709: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853629.1600566-29024-250211799264932=/root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853629.18732: variable 'ansible_module_compression' from source: unknown 28023 1726853629.18777: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853629.18808: variable 'ansible_facts' from source: unknown 28023 1726853629.18866: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932/AnsiballZ_command.py 28023 1726853629.18962: Sending initial data 28023 1726853629.18966: Sent initial data (156 bytes) 28023 1726853629.19424: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.19428: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853629.19431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853629.19434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853629.19436: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.19486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853629.19490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.19559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.21217: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853629.21441: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853629.21445: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp3c9sg7hy /root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932/AnsiballZ_command.py <<< 28023 1726853629.21448: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932/AnsiballZ_command.py" <<< 28023 1726853629.22029: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp3c9sg7hy" to remote "/root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932/AnsiballZ_command.py" <<< 28023 1726853629.23945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853629.24066: stderr chunk (state=3): >>><<< 28023 1726853629.24069: stdout chunk (state=3): >>><<< 28023 1726853629.24073: done transferring module to remote 28023 1726853629.24076: _low_level_execute_command(): starting 28023 1726853629.24078: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932/ /root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932/AnsiballZ_command.py && sleep 0' 28023 1726853629.25321: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853629.25374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.25398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853629.25411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.25481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.27591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853629.27595: stdout chunk (state=3): >>><<< 28023 1726853629.27597: stderr chunk (state=3): >>><<< 28023 1726853629.27779: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853629.27782: _low_level_execute_command(): starting 28023 1726853629.27785: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932/AnsiballZ_command.py && sleep 0' 28023 1726853629.28418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.28442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.28497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853629.28539: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.28600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.44466: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 13:33:49.439677", "end": "2024-09-20 13:33:49.443494", "delta": "0:00:00.003817", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853629.46301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853629.46305: stdout chunk (state=3): >>><<< 28023 1726853629.46307: stderr chunk (state=3): >>><<< 28023 1726853629.46406: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 13:33:49.439677", "end": "2024-09-20 13:33:49.443494", "delta": "0:00:00.003817", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853629.46410: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853629.46414: _low_level_execute_command(): starting 28023 1726853629.46417: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853629.1600566-29024-250211799264932/ > /dev/null 2>&1 && sleep 0' 28023 1726853629.47379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.47402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.47415: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.47479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853629.47512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.47576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.49461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853629.49469: stderr chunk (state=3): >>><<< 28023 1726853629.49475: stdout chunk (state=3): >>><<< 28023 1726853629.49515: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853629.49519: handler run complete 28023 1726853629.49528: Evaluated conditional (False): False 28023 1726853629.49559: attempt loop complete, returning result 28023 1726853629.49564: _execute() done 28023 1726853629.49566: dumping result to json 28023 1726853629.49569: done dumping result, returning 28023 1726853629.49572: done running TaskExecutor() for managed_node3/TASK: Get the IPv6 routes from the route table main [02083763-bbaf-fdb6-dad7-000000000062] 28023 1726853629.49575: sending task result for task 02083763-bbaf-fdb6-dad7-000000000062 28023 1726853629.49679: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000062 28023 1726853629.49682: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003817", "end": "2024-09-20 13:33:49.443494", "rc": 0, "start": "2024-09-20 13:33:49.439677" } STDOUT: 2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium 2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium 2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium fe80::/64 dev peerethtest0 proto kernel metric 256 pref medium fe80::/64 dev peerethtest1 proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest1 proto kernel metric 1024 pref medium 28023 1726853629.49787: no more pending results, returning what we have 28023 1726853629.49790: results queue empty 28023 1726853629.49791: checking for any_errors_fatal 28023 1726853629.49800: done checking for any_errors_fatal 28023 
1726853629.49800: checking for max_fail_percentage 28023 1726853629.49802: done checking for max_fail_percentage 28023 1726853629.49803: checking to see if all hosts have failed and the running result is not ok 28023 1726853629.49804: done checking to see if all hosts have failed 28023 1726853629.49805: getting the remaining hosts for this loop 28023 1726853629.49807: done getting the remaining hosts for this loop 28023 1726853629.49810: getting the next task for host managed_node3 28023 1726853629.49816: done getting next task for host managed_node3 28023 1726853629.49818: ^ task is: TASK: Assert that the route table main contains the specified IPv6 routes 28023 1726853629.49820: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853629.49823: getting variables 28023 1726853629.49824: in VariableManager get_vars() 28023 1726853629.49861: Calling all_inventory to load vars for managed_node3 28023 1726853629.49864: Calling groups_inventory to load vars for managed_node3 28023 1726853629.49866: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853629.49973: Calling all_plugins_play to load vars for managed_node3 28023 1726853629.49978: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853629.49986: Calling groups_plugins_play to load vars for managed_node3 28023 1726853629.51943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853629.52846: done with get_vars() 28023 1726853629.52863: done getting variables 28023 1726853629.52907: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv6 routes] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:94 Friday 20 September 2024 13:33:49 -0400 (0:00:00.408) 0:00:21.613 ****** 28023 1726853629.52929: entering _queue_task() for managed_node3/assert 28023 1726853629.53180: worker is 1 (out of 1 available) 28023 1726853629.53195: exiting _queue_task() for managed_node3/assert 28023 1726853629.53208: done queuing things up, now waiting for results queue to drain 28023 1726853629.53209: waiting for pending results... 28023 1726853629.53399: running TaskExecutor() for managed_node3/TASK: Assert that the route table main contains the specified IPv6 routes 28023 1726853629.53475: in run() - task 02083763-bbaf-fdb6-dad7-000000000063 28023 1726853629.53488: variable 'ansible_search_path' from source: unknown 28023 1726853629.53517: calling self._execute() 28023 1726853629.53601: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.53605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.53613: variable 'omit' from source: magic vars 28023 1726853629.53984: variable 'ansible_distribution_major_version' from source: facts 28023 1726853629.53987: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853629.53990: variable 'omit' from source: magic vars 28023 1726853629.54005: variable 'omit' from source: magic vars 28023 1726853629.54177: variable 'omit' from source: magic vars 28023 1726853629.54181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853629.54184: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853629.54187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853629.54189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853629.54190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853629.54216: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853629.54225: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.54231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.54330: Set connection var ansible_shell_type to sh 28023 1726853629.54342: Set connection var ansible_shell_executable to /bin/sh 28023 1726853629.54350: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853629.54362: Set connection var ansible_connection to ssh 28023 1726853629.54375: Set connection var ansible_pipelining to False 28023 1726853629.54386: Set connection var ansible_timeout to 10 28023 1726853629.54416: variable 'ansible_shell_executable' from source: unknown 28023 1726853629.54423: variable 'ansible_connection' from source: unknown 28023 1726853629.54430: variable 'ansible_module_compression' from source: unknown 28023 1726853629.54436: variable 'ansible_shell_type' from source: unknown 28023 1726853629.54441: variable 'ansible_shell_executable' from source: unknown 28023 1726853629.54447: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.54453: variable 'ansible_pipelining' from source: unknown 28023 1726853629.54462: variable 'ansible_timeout' from source: unknown 28023 1726853629.54470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 
1726853629.54616: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853629.54636: variable 'omit' from source: magic vars 28023 1726853629.54647: starting attempt loop 28023 1726853629.54661: running the handler 28023 1726853629.54894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853629.55074: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853629.55112: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853629.55221: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853629.55244: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853629.55324: variable 'route_table_main_ipv6' from source: set_fact 28023 1726853629.55349: Evaluated conditional (route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2")): True 28023 1726853629.55353: handler run complete 28023 1726853629.55370: attempt loop complete, returning result 28023 1726853629.55391: _execute() done 28023 1726853629.55395: dumping result to json 28023 1726853629.55397: done dumping result, returning 28023 1726853629.55399: done running TaskExecutor() for managed_node3/TASK: Assert that the route table main contains the specified IPv6 routes [02083763-bbaf-fdb6-dad7-000000000063] 28023 1726853629.55428: sending task result for task 02083763-bbaf-fdb6-dad7-000000000063 28023 1726853629.55515: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000063 28023 1726853629.55518: WORKER 
PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 28023 1726853629.55566: no more pending results, returning what we have 28023 1726853629.55569: results queue empty 28023 1726853629.55570: checking for any_errors_fatal 28023 1726853629.55586: done checking for any_errors_fatal 28023 1726853629.55587: checking for max_fail_percentage 28023 1726853629.55590: done checking for max_fail_percentage 28023 1726853629.55591: checking to see if all hosts have failed and the running result is not ok 28023 1726853629.55592: done checking to see if all hosts have failed 28023 1726853629.55593: getting the remaining hosts for this loop 28023 1726853629.55594: done getting the remaining hosts for this loop 28023 1726853629.55598: getting the next task for host managed_node3 28023 1726853629.55604: done getting next task for host managed_node3 28023 1726853629.55606: ^ task is: TASK: Get the interface1 MAC address 28023 1726853629.55609: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
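An editorial aside on the assertion evaluated above: Ansible's Jinja2 `search` test wraps Python's `re.search`, so the conditional `route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2")` can be reproduced outside Ansible. A minimal sketch, using the first route line from this run's captured stdout (the variable name mirrors the playbook fact; the standalone script itself is illustrative, not part of the test suite):

```python
import re

# First line of the `ip -6 route` stdout captured by the previous task.
route_table_main_ipv6 = (
    "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium"
)

# Same pattern the playbook's `search` test evaluates; `proto static ` is
# optional, so the assertion also passes when the kernel omits that token.
pattern = r"2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2"

match = re.search(pattern, route_table_main_ipv6)
print("All assertions passed" if match else "Assertion failed")
```

Because `re.search` scans anywhere in the string, the trailing `pref medium` in the route line does not need to appear in the pattern.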
False 28023 1726853629.55612: getting variables 28023 1726853629.55614: in VariableManager get_vars() 28023 1726853629.55662: Calling all_inventory to load vars for managed_node3 28023 1726853629.55665: Calling groups_inventory to load vars for managed_node3 28023 1726853629.55667: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853629.55680: Calling all_plugins_play to load vars for managed_node3 28023 1726853629.55682: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853629.55685: Calling groups_plugins_play to load vars for managed_node3 28023 1726853629.56623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853629.58027: done with get_vars() 28023 1726853629.58048: done getting variables 28023 1726853629.58118: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the interface1 MAC address] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:99 Friday 20 September 2024 13:33:49 -0400 (0:00:00.052) 0:00:21.665 ****** 28023 1726853629.58151: entering _queue_task() for managed_node3/command 28023 1726853629.58509: worker is 1 (out of 1 available) 28023 1726853629.58524: exiting _queue_task() for managed_node3/command 28023 1726853629.58537: done queuing things up, now waiting for results queue to drain 28023 1726853629.58538: waiting for pending results... 
28023 1726853629.58866: running TaskExecutor() for managed_node3/TASK: Get the interface1 MAC address 28023 1726853629.58951: in run() - task 02083763-bbaf-fdb6-dad7-000000000064 28023 1726853629.58977: variable 'ansible_search_path' from source: unknown 28023 1726853629.59084: calling self._execute() 28023 1726853629.59176: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.59180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.59183: variable 'omit' from source: magic vars 28023 1726853629.59551: variable 'ansible_distribution_major_version' from source: facts 28023 1726853629.59576: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853629.59582: variable 'omit' from source: magic vars 28023 1726853629.59606: variable 'omit' from source: magic vars 28023 1726853629.59724: variable 'interface1' from source: play vars 28023 1726853629.59748: variable 'omit' from source: magic vars 28023 1726853629.59788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853629.59814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853629.59832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853629.59847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853629.59857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853629.59890: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853629.59893: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.59895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 
1726853629.59963: Set connection var ansible_shell_type to sh 28023 1726853629.59981: Set connection var ansible_shell_executable to /bin/sh 28023 1726853629.59985: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853629.59987: Set connection var ansible_connection to ssh 28023 1726853629.59989: Set connection var ansible_pipelining to False 28023 1726853629.59992: Set connection var ansible_timeout to 10 28023 1726853629.60061: variable 'ansible_shell_executable' from source: unknown 28023 1726853629.60064: variable 'ansible_connection' from source: unknown 28023 1726853629.60067: variable 'ansible_module_compression' from source: unknown 28023 1726853629.60069: variable 'ansible_shell_type' from source: unknown 28023 1726853629.60073: variable 'ansible_shell_executable' from source: unknown 28023 1726853629.60075: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.60078: variable 'ansible_pipelining' from source: unknown 28023 1726853629.60081: variable 'ansible_timeout' from source: unknown 28023 1726853629.60083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.60163: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853629.60176: variable 'omit' from source: magic vars 28023 1726853629.60181: starting attempt loop 28023 1726853629.60184: running the handler 28023 1726853629.60198: _low_level_execute_command(): starting 28023 1726853629.60205: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853629.60891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853629.60898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28023 1726853629.60903: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.60962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853629.60966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.61034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.62738: stdout chunk (state=3): >>>/root <<< 28023 1726853629.62846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853629.62900: stderr chunk (state=3): >>><<< 28023 1726853629.62904: stdout chunk (state=3): >>><<< 28023 1726853629.62920: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853629.62932: _low_level_execute_command(): starting 28023 1726853629.62938: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929 `" && echo ansible-tmp-1726853629.6292062-29057-193076601038929="` echo /root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929 `" ) && sleep 0' 28023 1726853629.63448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.63451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.63454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 28023 1726853629.63464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.63511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853629.63521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.63608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.65559: stdout chunk (state=3): >>>ansible-tmp-1726853629.6292062-29057-193076601038929=/root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929 <<< 28023 1726853629.65668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853629.65696: stderr chunk (state=3): >>><<< 28023 1726853629.65700: stdout chunk (state=3): >>><<< 28023 1726853629.65714: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853629.6292062-29057-193076601038929=/root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853629.65746: variable 'ansible_module_compression' from source: unknown 28023 1726853629.65788: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853629.65826: variable 'ansible_facts' from source: unknown 28023 1726853629.65892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929/AnsiballZ_command.py 28023 1726853629.65995: Sending initial data 28023 1726853629.65998: Sent initial data (156 bytes) 28023 1726853629.66440: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853629.66443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853629.66446: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.66448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.66450: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.66513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853629.66517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853629.66519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.66569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.68176: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853629.68242: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853629.68327: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpry43esua /root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929/AnsiballZ_command.py <<< 28023 1726853629.68331: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929/AnsiballZ_command.py" <<< 28023 1726853629.68373: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpry43esua" to remote "/root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929/AnsiballZ_command.py" <<< 28023 1726853629.69051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853629.69136: stderr chunk (state=3): >>><<< 28023 1726853629.69139: stdout chunk (state=3): >>><<< 28023 1726853629.69147: done transferring module to remote 28023 1726853629.69156: _low_level_execute_command(): starting 28023 1726853629.69165: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929/ /root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929/AnsiballZ_command.py && sleep 0' 28023 1726853629.69682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.69685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.69687: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.69689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.69732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853629.69744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.69808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.71633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853629.71656: stderr chunk (state=3): >>><<< 28023 1726853629.71659: stdout chunk (state=3): >>><<< 28023 1726853629.71680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853629.71683: _low_level_execute_command(): starting 28023 1726853629.71688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929/AnsiballZ_command.py && sleep 0' 28023 1726853629.72152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853629.72156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853629.72158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.72160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853629.72162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.72217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 
1726853629.72221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.72295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.88304: stdout chunk (state=3): >>> {"changed": true, "stdout": "32:38:a6:2e:17:d5", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-20 13:33:49.877681", "end": "2024-09-20 13:33:49.881010", "delta": "0:00:00.003329", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853629.89840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853629.89854: stderr chunk (state=3): >>><<< 28023 1726853629.89857: stdout chunk (state=3): >>><<< 28023 1726853629.89878: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "32:38:a6:2e:17:d5", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-20 13:33:49.877681", "end": "2024-09-20 13:33:49.881010", "delta": "0:00:00.003329", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853629.89911: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/ethtest1/address', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853629.89917: _low_level_execute_command(): starting 28023 1726853629.89922: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853629.6292062-29057-193076601038929/ > /dev/null 2>&1 && sleep 0' 28023 1726853629.90517: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.90521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853629.90579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853629.90623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853629.92530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853629.92533: stdout chunk (state=3): >>><<< 28023 1726853629.92777: stderr chunk (state=3): >>><<< 28023 1726853629.92781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853629.92783: handler run complete 28023 1726853629.92785: Evaluated conditional (False): False 28023 1726853629.92787: attempt loop complete, returning result 28023 1726853629.92789: _execute() done 28023 1726853629.92791: dumping result to json 28023 1726853629.92793: done dumping result, returning 28023 1726853629.92794: done running TaskExecutor() for managed_node3/TASK: Get the interface1 MAC address [02083763-bbaf-fdb6-dad7-000000000064] 28023 1726853629.92796: sending task result for task 02083763-bbaf-fdb6-dad7-000000000064 28023 1726853629.92878: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000064 28023 1726853629.92881: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "cat", "/sys/class/net/ethtest1/address" ], "delta": "0:00:00.003329", "end": "2024-09-20 13:33:49.881010", "rc": 0, "start": "2024-09-20 13:33:49.877681" } STDOUT: 32:38:a6:2e:17:d5 28023 1726853629.92956: no more pending results, returning what we have 28023 1726853629.92960: results queue empty 28023 1726853629.92960: checking for any_errors_fatal 28023 1726853629.92972: done checking for any_errors_fatal 28023 1726853629.92974: checking 
for max_fail_percentage 28023 1726853629.92976: done checking for max_fail_percentage 28023 1726853629.92977: checking to see if all hosts have failed and the running result is not ok 28023 1726853629.92978: done checking to see if all hosts have failed 28023 1726853629.92979: getting the remaining hosts for this loop 28023 1726853629.92981: done getting the remaining hosts for this loop 28023 1726853629.92984: getting the next task for host managed_node3 28023 1726853629.92993: done getting next task for host managed_node3 28023 1726853629.92998: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28023 1726853629.93002: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853629.93027: getting variables 28023 1726853629.93029: in VariableManager get_vars() 28023 1726853629.93107: Calling all_inventory to load vars for managed_node3 28023 1726853629.93110: Calling groups_inventory to load vars for managed_node3 28023 1726853629.93113: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853629.93125: Calling all_plugins_play to load vars for managed_node3 28023 1726853629.93128: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853629.93131: Calling groups_plugins_play to load vars for managed_node3 28023 1726853629.95149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853629.97933: done with get_vars() 28023 1726853629.98001: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:33:49 -0400 (0:00:00.400) 0:00:22.065 ****** 28023 1726853629.98182: entering _queue_task() for managed_node3/include_tasks 28023 1726853629.98539: worker is 1 (out of 1 available) 28023 1726853629.98559: exiting _queue_task() for managed_node3/include_tasks 28023 1726853629.98574: done queuing things up, now waiting for results queue to drain 28023 1726853629.98577: waiting for pending results... 
28023 1726853629.98789: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28023 1726853629.98977: in run() - task 02083763-bbaf-fdb6-dad7-00000000006c 28023 1726853629.98980: variable 'ansible_search_path' from source: unknown 28023 1726853629.98994: variable 'ansible_search_path' from source: unknown 28023 1726853629.99021: calling self._execute() 28023 1726853629.99128: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853629.99134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853629.99143: variable 'omit' from source: magic vars 28023 1726853629.99439: variable 'ansible_distribution_major_version' from source: facts 28023 1726853629.99447: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853629.99453: _execute() done 28023 1726853629.99459: dumping result to json 28023 1726853629.99463: done dumping result, returning 28023 1726853629.99465: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-fdb6-dad7-00000000006c] 28023 1726853629.99472: sending task result for task 02083763-bbaf-fdb6-dad7-00000000006c 28023 1726853629.99565: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000006c 28023 1726853629.99568: WORKER PROCESS EXITING 28023 1726853629.99608: no more pending results, returning what we have 28023 1726853629.99612: in VariableManager get_vars() 28023 1726853629.99661: Calling all_inventory to load vars for managed_node3 28023 1726853629.99664: Calling groups_inventory to load vars for managed_node3 28023 1726853629.99667: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853629.99680: Calling all_plugins_play to load vars for managed_node3 28023 1726853629.99683: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853629.99686: Calling 
groups_plugins_play to load vars for managed_node3 28023 1726853630.01169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853630.02316: done with get_vars() 28023 1726853630.02331: variable 'ansible_search_path' from source: unknown 28023 1726853630.02332: variable 'ansible_search_path' from source: unknown 28023 1726853630.02361: we have included files to process 28023 1726853630.02362: generating all_blocks data 28023 1726853630.02364: done generating all_blocks data 28023 1726853630.02368: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28023 1726853630.02368: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28023 1726853630.02370: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28023 1726853630.02747: done processing included file 28023 1726853630.02749: iterating over new_blocks loaded from include file 28023 1726853630.02750: in VariableManager get_vars() 28023 1726853630.02768: done with get_vars() 28023 1726853630.02770: filtering new block on tags 28023 1726853630.02784: done filtering new block on tags 28023 1726853630.02786: in VariableManager get_vars() 28023 1726853630.02800: done with get_vars() 28023 1726853630.02801: filtering new block on tags 28023 1726853630.02813: done filtering new block on tags 28023 1726853630.02814: in VariableManager get_vars() 28023 1726853630.02827: done with get_vars() 28023 1726853630.02828: filtering new block on tags 28023 1726853630.02838: done filtering new block on tags 28023 1726853630.02839: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 28023 1726853630.02843: extending task lists for 
all hosts with included blocks 28023 1726853630.03378: done extending task lists 28023 1726853630.03379: done processing included files 28023 1726853630.03380: results queue empty 28023 1726853630.03380: checking for any_errors_fatal 28023 1726853630.03383: done checking for any_errors_fatal 28023 1726853630.03383: checking for max_fail_percentage 28023 1726853630.03384: done checking for max_fail_percentage 28023 1726853630.03385: checking to see if all hosts have failed and the running result is not ok 28023 1726853630.03385: done checking to see if all hosts have failed 28023 1726853630.03386: getting the remaining hosts for this loop 28023 1726853630.03387: done getting the remaining hosts for this loop 28023 1726853630.03388: getting the next task for host managed_node3 28023 1726853630.03391: done getting next task for host managed_node3 28023 1726853630.03393: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28023 1726853630.03394: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853630.03401: getting variables 28023 1726853630.03401: in VariableManager get_vars() 28023 1726853630.03412: Calling all_inventory to load vars for managed_node3 28023 1726853630.03413: Calling groups_inventory to load vars for managed_node3 28023 1726853630.03414: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853630.03418: Calling all_plugins_play to load vars for managed_node3 28023 1726853630.03419: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853630.03421: Calling groups_plugins_play to load vars for managed_node3 28023 1726853630.04523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853630.06376: done with get_vars() 28023 1726853630.06392: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:33:50 -0400 (0:00:00.082) 0:00:22.148 ****** 28023 1726853630.06443: entering _queue_task() for managed_node3/setup 28023 1726853630.06745: worker is 1 (out of 1 available) 28023 1726853630.06763: exiting _queue_task() for managed_node3/setup 28023 1726853630.06779: done queuing things up, now waiting for results queue to drain 28023 1726853630.06781: waiting for pending results... 
28023 1726853630.07092: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28023 1726853630.07243: in run() - task 02083763-bbaf-fdb6-dad7-000000000563 28023 1726853630.07253: variable 'ansible_search_path' from source: unknown 28023 1726853630.07259: variable 'ansible_search_path' from source: unknown 28023 1726853630.07289: calling self._execute() 28023 1726853630.07407: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853630.07415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853630.07426: variable 'omit' from source: magic vars 28023 1726853630.07860: variable 'ansible_distribution_major_version' from source: facts 28023 1726853630.07863: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853630.08148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853630.10875: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853630.10957: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853630.11003: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853630.11028: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853630.11053: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853630.11120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853630.11141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853630.11240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853630.11243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853630.11246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853630.11303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853630.11307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853630.11316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853630.11411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853630.11414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853630.11520: variable '__network_required_facts' from source: role 
'' defaults 28023 1726853630.11526: variable 'ansible_facts' from source: unknown 28023 1726853630.12205: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28023 1726853630.12209: when evaluation is False, skipping this task 28023 1726853630.12212: _execute() done 28023 1726853630.12214: dumping result to json 28023 1726853630.12217: done dumping result, returning 28023 1726853630.12228: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-fdb6-dad7-000000000563] 28023 1726853630.12230: sending task result for task 02083763-bbaf-fdb6-dad7-000000000563 28023 1726853630.12331: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000563 28023 1726853630.12334: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853630.12384: no more pending results, returning what we have 28023 1726853630.12387: results queue empty 28023 1726853630.12388: checking for any_errors_fatal 28023 1726853630.12389: done checking for any_errors_fatal 28023 1726853630.12390: checking for max_fail_percentage 28023 1726853630.12391: done checking for max_fail_percentage 28023 1726853630.12392: checking to see if all hosts have failed and the running result is not ok 28023 1726853630.12393: done checking to see if all hosts have failed 28023 1726853630.12394: getting the remaining hosts for this loop 28023 1726853630.12395: done getting the remaining hosts for this loop 28023 1726853630.12399: getting the next task for host managed_node3 28023 1726853630.12407: done getting next task for host managed_node3 28023 1726853630.12411: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28023 1726853630.12415: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853630.12431: getting variables 28023 1726853630.12433: in VariableManager get_vars() 28023 1726853630.12479: Calling all_inventory to load vars for managed_node3 28023 1726853630.12482: Calling groups_inventory to load vars for managed_node3 28023 1726853630.12484: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853630.12495: Calling all_plugins_play to load vars for managed_node3 28023 1726853630.12497: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853630.12499: Calling groups_plugins_play to load vars for managed_node3 28023 1726853630.13463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853630.14949: done with get_vars() 28023 1726853630.15173: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:33:50 -0400 (0:00:00.088) 0:00:22.236 ****** 28023 1726853630.15376: entering _queue_task() for managed_node3/stat 28023 1726853630.16113: worker is 1 (out of 1 
available) 28023 1726853630.16129: exiting _queue_task() for managed_node3/stat 28023 1726853630.16192: done queuing things up, now waiting for results queue to drain 28023 1726853630.16194: waiting for pending results... 28023 1726853630.16491: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 28023 1726853630.16623: in run() - task 02083763-bbaf-fdb6-dad7-000000000565 28023 1726853630.16627: variable 'ansible_search_path' from source: unknown 28023 1726853630.16631: variable 'ansible_search_path' from source: unknown 28023 1726853630.16667: calling self._execute() 28023 1726853630.16840: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853630.16848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853630.16852: variable 'omit' from source: magic vars 28023 1726853630.17322: variable 'ansible_distribution_major_version' from source: facts 28023 1726853630.17344: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853630.17588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853630.18043: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853630.18118: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853630.18164: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853630.18207: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853630.18328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853630.18346: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853630.18395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853630.18436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853630.18545: variable '__network_is_ostree' from source: set_fact 28023 1726853630.18548: Evaluated conditional (not __network_is_ostree is defined): False 28023 1726853630.18550: when evaluation is False, skipping this task 28023 1726853630.18559: _execute() done 28023 1726853630.18589: dumping result to json 28023 1726853630.18592: done dumping result, returning 28023 1726853630.18594: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-fdb6-dad7-000000000565] 28023 1726853630.18652: sending task result for task 02083763-bbaf-fdb6-dad7-000000000565 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28023 1726853630.18853: no more pending results, returning what we have 28023 1726853630.18860: results queue empty 28023 1726853630.18862: checking for any_errors_fatal 28023 1726853630.18873: done checking for any_errors_fatal 28023 1726853630.18874: checking for max_fail_percentage 28023 1726853630.18875: done checking for max_fail_percentage 28023 1726853630.18876: checking to see if all hosts have failed and the running result is not ok 28023 1726853630.18877: done checking to see if all hosts have failed 28023 1726853630.18878: getting the remaining hosts for this loop 28023 
1726853630.18880: done getting the remaining hosts for this loop 28023 1726853630.18883: getting the next task for host managed_node3 28023 1726853630.18889: done getting next task for host managed_node3 28023 1726853630.18893: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28023 1726853630.18897: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853630.18916: getting variables 28023 1726853630.18917: in VariableManager get_vars() 28023 1726853630.18959: Calling all_inventory to load vars for managed_node3 28023 1726853630.18962: Calling groups_inventory to load vars for managed_node3 28023 1726853630.18964: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853630.19001: Calling all_plugins_play to load vars for managed_node3 28023 1726853630.19004: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853630.19009: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000565 28023 1726853630.19011: WORKER PROCESS EXITING 28023 1726853630.19014: Calling groups_plugins_play to load vars for managed_node3 28023 1726853630.19864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853630.20835: done with get_vars() 28023 1726853630.20851: done getting variables 28023 1726853630.20893: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:33:50 -0400 (0:00:00.056) 0:00:22.293 ****** 28023 1726853630.20917: entering _queue_task() for managed_node3/set_fact 28023 1726853630.21143: worker is 1 (out of 1 available) 28023 1726853630.21157: exiting _queue_task() for managed_node3/set_fact 28023 1726853630.21170: done queuing things up, now waiting for results queue to drain 28023 1726853630.21173: waiting for pending results... 
28023 1726853630.21359: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28023 1726853630.21566: in run() - task 02083763-bbaf-fdb6-dad7-000000000566 28023 1726853630.21574: variable 'ansible_search_path' from source: unknown 28023 1726853630.21582: variable 'ansible_search_path' from source: unknown 28023 1726853630.21660: calling self._execute() 28023 1726853630.21977: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853630.21981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853630.21984: variable 'omit' from source: magic vars 28023 1726853630.22191: variable 'ansible_distribution_major_version' from source: facts 28023 1726853630.22243: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853630.22504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853630.22703: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853630.22734: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853630.22762: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853630.22792: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853630.22848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853630.22870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853630.22890: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853630.22907: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853630.22968: variable '__network_is_ostree' from source: set_fact 28023 1726853630.22974: Evaluated conditional (not __network_is_ostree is defined): False 28023 1726853630.22977: when evaluation is False, skipping this task 28023 1726853630.22979: _execute() done 28023 1726853630.22982: dumping result to json 28023 1726853630.22993: done dumping result, returning 28023 1726853630.22996: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-fdb6-dad7-000000000566] 28023 1726853630.22999: sending task result for task 02083763-bbaf-fdb6-dad7-000000000566 28023 1726853630.23077: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000566 28023 1726853630.23079: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28023 1726853630.23140: no more pending results, returning what we have 28023 1726853630.23143: results queue empty 28023 1726853630.23144: checking for any_errors_fatal 28023 1726853630.23153: done checking for any_errors_fatal 28023 1726853630.23154: checking for max_fail_percentage 28023 1726853630.23155: done checking for max_fail_percentage 28023 1726853630.23156: checking to see if all hosts have failed and the running result is not ok 28023 1726853630.23160: done checking to see if all hosts have failed 28023 1726853630.23160: getting the remaining hosts for this loop 28023 1726853630.23162: done getting the remaining hosts for this loop 
28023 1726853630.23165: getting the next task for host managed_node3 28023 1726853630.23175: done getting next task for host managed_node3 28023 1726853630.23179: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28023 1726853630.23182: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853630.23196: getting variables 28023 1726853630.23197: in VariableManager get_vars() 28023 1726853630.23231: Calling all_inventory to load vars for managed_node3 28023 1726853630.23234: Calling groups_inventory to load vars for managed_node3 28023 1726853630.23236: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853630.23244: Calling all_plugins_play to load vars for managed_node3 28023 1726853630.23246: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853630.23249: Calling groups_plugins_play to load vars for managed_node3 28023 1726853630.24318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853630.26948: done with get_vars() 28023 1726853630.27094: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:33:50 -0400 (0:00:00.063) 0:00:22.356 ****** 28023 1726853630.27292: entering _queue_task() for managed_node3/service_facts 28023 1726853630.27831: worker is 1 (out of 1 available) 28023 1726853630.27962: exiting _queue_task() for managed_node3/service_facts 28023 1726853630.27983: done queuing things up, now waiting for results queue to drain 28023 1726853630.27984: waiting for pending results... 
28023 1726853630.28222: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 28023 1726853630.28479: in run() - task 02083763-bbaf-fdb6-dad7-000000000568 28023 1726853630.28483: variable 'ansible_search_path' from source: unknown 28023 1726853630.28486: variable 'ansible_search_path' from source: unknown 28023 1726853630.28489: calling self._execute() 28023 1726853630.28625: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853630.28639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853630.28653: variable 'omit' from source: magic vars 28023 1726853630.29087: variable 'ansible_distribution_major_version' from source: facts 28023 1726853630.29104: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853630.29114: variable 'omit' from source: magic vars 28023 1726853630.29238: variable 'omit' from source: magic vars 28023 1726853630.29254: variable 'omit' from source: magic vars 28023 1726853630.29315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853630.29362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853630.29454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853630.29460: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853630.29462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853630.29492: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853630.29503: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853630.29510: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 28023 1726853630.29626: Set connection var ansible_shell_type to sh 28023 1726853630.29676: Set connection var ansible_shell_executable to /bin/sh 28023 1726853630.29679: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853630.29682: Set connection var ansible_connection to ssh 28023 1726853630.29684: Set connection var ansible_pipelining to False 28023 1726853630.29686: Set connection var ansible_timeout to 10 28023 1726853630.29712: variable 'ansible_shell_executable' from source: unknown 28023 1726853630.29822: variable 'ansible_connection' from source: unknown 28023 1726853630.29826: variable 'ansible_module_compression' from source: unknown 28023 1726853630.29828: variable 'ansible_shell_type' from source: unknown 28023 1726853630.29830: variable 'ansible_shell_executable' from source: unknown 28023 1726853630.29832: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853630.29835: variable 'ansible_pipelining' from source: unknown 28023 1726853630.29837: variable 'ansible_timeout' from source: unknown 28023 1726853630.29839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853630.29990: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853630.30007: variable 'omit' from source: magic vars 28023 1726853630.30018: starting attempt loop 28023 1726853630.30025: running the handler 28023 1726853630.30060: _low_level_execute_command(): starting 28023 1726853630.30078: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853630.30937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853630.30994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853630.31011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853630.31062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853630.31165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853630.32870: stdout chunk (state=3): >>>/root <<< 28023 1726853630.33034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853630.33038: stdout chunk (state=3): >>><<< 28023 1726853630.33041: stderr chunk (state=3): >>><<< 28023 1726853630.33060: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853630.33162: _low_level_execute_command(): starting 28023 1726853630.33166: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470 `" && echo ansible-tmp-1726853630.3306746-29087-183384221614470="` echo /root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470 `" ) && sleep 0' 28023 1726853630.33668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853630.33680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853630.33711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853630.33715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853630.33718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853630.33776: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853630.33786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853630.33789: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853630.33791: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853630.33793: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853630.33795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853630.33797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853630.33860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853630.33865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853630.33961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853630.35928: stdout chunk (state=3): >>>ansible-tmp-1726853630.3306746-29087-183384221614470=/root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470 <<< 28023 1726853630.36092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853630.36095: stdout chunk (state=3): >>><<< 28023 1726853630.36097: stderr chunk (state=3): >>><<< 28023 1726853630.36112: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853630.3306746-29087-183384221614470=/root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853630.36280: variable 'ansible_module_compression' from source: unknown 28023 1726853630.36283: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28023 1726853630.36285: variable 'ansible_facts' from source: unknown 28023 1726853630.36374: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470/AnsiballZ_service_facts.py 28023 1726853630.36526: Sending initial data 28023 1726853630.36629: Sent initial data (162 bytes) 28023 1726853630.37238: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853630.37253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853630.37293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853630.37401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853630.37427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853630.37525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853630.39131: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853630.39184: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853630.39248: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp58cwdv06 /root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470/AnsiballZ_service_facts.py <<< 28023 1726853630.39251: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470/AnsiballZ_service_facts.py" <<< 28023 1726853630.39304: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp58cwdv06" to remote "/root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470/AnsiballZ_service_facts.py" <<< 28023 1726853630.39918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853630.39951: stderr chunk (state=3): >>><<< 28023 1726853630.39954: stdout chunk (state=3): >>><<< 28023 1726853630.39970: done transferring module to remote 28023 1726853630.39982: _low_level_execute_command(): starting 28023 1726853630.39990: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470/ /root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470/AnsiballZ_service_facts.py && sleep 0' 28023 1726853630.40626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853630.40635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853630.40691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853630.42512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853630.42538: stderr chunk (state=3): >>><<< 28023 1726853630.42541: stdout chunk (state=3): >>><<< 28023 1726853630.42553: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853630.42559: _low_level_execute_command(): starting 28023 1726853630.42563: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470/AnsiballZ_service_facts.py && sleep 0' 28023 1726853630.42990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853630.42993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853630.42996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853630.43001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853630.43052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853630.43060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853630.43062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853630.43126: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853632.00332: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 28023 1726853632.00339: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 28023 1726853632.00358: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 28023 1726853632.00382: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 28023 1726853632.00413: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": 
"systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": 
"user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28023 1726853632.01990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853632.02024: stderr chunk (state=3): >>><<< 28023 1726853632.02027: stdout chunk (state=3): >>><<< 28023 1726853632.02052: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
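The JSON blob ending above is the `service_facts` module's return payload: a dict keyed by unit name, where each entry carries `name`, `state`, `status`, and `source`. A minimal sketch of filtering that shape (the helper names are hypothetical; the sample data is a small excerpt of entries taken from the log, not the full list):

```python
# Sample excerpt of the service_facts dict shape shown in the log above.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "kdump.service": {"name": "kdump.service", "state": "stopped",
                      "status": "enabled", "source": "systemd"},
}

def running_services(facts):
    """Names of units the payload reports as currently running."""
    return sorted(n for n, s in facts.items() if s["state"] == "running")

def enabled_but_stopped(facts):
    """Enabled units that are not currently running (e.g. kdump above)."""
    return sorted(n for n, s in facts.items()
                  if s["status"] == "enabled" and s["state"] != "running")

print(running_services(services))     # ['sshd.service']
print(enabled_but_stopped(services))  # ['kdump.service']
```

In a playbook this same dict is what `ansible_facts.services` holds after the "Check which services are running" task completes.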
28023 1726853632.02489: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853632.02497: _low_level_execute_command(): starting 28023 1726853632.02502: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853630.3306746-29087-183384221614470/ > /dev/null 2>&1 && sleep 0' 28023 1726853632.02935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853632.02942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853632.02976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853632.02979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853632.02981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
28023 1726853632.02983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853632.03039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853632.03042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853632.03077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853632.03132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853632.05009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853632.05038: stderr chunk (state=3): >>><<< 28023 1726853632.05041: stdout chunk (state=3): >>><<< 28023 1726853632.05054: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853632.05061: handler run complete 28023 1726853632.05183: variable 'ansible_facts' from source: unknown 28023 1726853632.05275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853632.05770: variable 'ansible_facts' from source: unknown 28023 1726853632.05849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853632.05962: attempt loop complete, returning result 28023 1726853632.05966: _execute() done 28023 1726853632.05970: dumping result to json 28023 1726853632.06010: done dumping result, returning 28023 1726853632.06018: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-fdb6-dad7-000000000568] 28023 1726853632.06022: sending task result for task 02083763-bbaf-fdb6-dad7-000000000568 28023 1726853632.06729: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000568 28023 1726853632.06732: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853632.06785: no more pending results, returning what we have 28023 1726853632.06787: results queue empty 28023 1726853632.06788: checking for any_errors_fatal 28023 1726853632.06790: done checking for any_errors_fatal 28023 1726853632.06791: checking for max_fail_percentage 28023 1726853632.06792: done checking for max_fail_percentage 28023 1726853632.06792: checking to see if all hosts have failed and the running result is not ok 28023 1726853632.06793: done checking to see if all hosts have failed 28023 1726853632.06793: getting the remaining hosts for this loop 28023 1726853632.06794: done getting the remaining 
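The `ok: [managed_node3]` result above is censored because the task ran with `no_log: true` (visible as `'_ansible_no_log': True` in the module args earlier). A simplified illustration of that censoring behavior — not Ansible's actual implementation, just the observable effect: the result payload is replaced by the notice string while `changed` is preserved:

```python
# The exact censor text seen in the log output above.
CENSOR_MESSAGE = ("the output has been hidden due to the fact that "
                  "'no_log: true' was specified for this result")

def censor_result(result, no_log):
    """Illustrative only: replace a task result's payload with the censor
    notice when no_log is set, keeping the 'changed' flag."""
    if not no_log:
        return result
    return {"censored": CENSOR_MESSAGE, "changed": result.get("changed", False)}

raw = {"ansible_facts": {"services": {"sshd.service": {"state": "running"}}},
       "changed": False}
print(censor_result(raw, no_log=True))
```

Note that, as the log shows, censoring applies to the displayed result; the facts themselves are still registered for later tasks.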
hosts for this loop 28023 1726853632.06796: getting the next task for host managed_node3 28023 1726853632.06801: done getting next task for host managed_node3 28023 1726853632.06804: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28023 1726853632.06807: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853632.06814: getting variables 28023 1726853632.06815: in VariableManager get_vars() 28023 1726853632.06837: Calling all_inventory to load vars for managed_node3 28023 1726853632.06839: Calling groups_inventory to load vars for managed_node3 28023 1726853632.06840: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853632.06846: Calling all_plugins_play to load vars for managed_node3 28023 1726853632.06848: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853632.06849: Calling groups_plugins_play to load vars for managed_node3 28023 1726853632.07513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853632.08375: done with get_vars() 28023 1726853632.08391: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:33:52 -0400 (0:00:01.811) 0:00:24.168 ****** 28023 1726853632.08467: entering _queue_task() for managed_node3/package_facts 28023 1726853632.08711: worker is 1 (out of 1 available) 28023 1726853632.08725: exiting _queue_task() for managed_node3/package_facts 28023 1726853632.08738: done queuing things up, now waiting for results queue to drain 28023 1726853632.08739: waiting for pending results... 
28023 1726853632.08922: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 28023 1726853632.09030: in run() - task 02083763-bbaf-fdb6-dad7-000000000569 28023 1726853632.09045: variable 'ansible_search_path' from source: unknown 28023 1726853632.09048: variable 'ansible_search_path' from source: unknown 28023 1726853632.09086: calling self._execute() 28023 1726853632.09161: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853632.09168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853632.09178: variable 'omit' from source: magic vars 28023 1726853632.09453: variable 'ansible_distribution_major_version' from source: facts 28023 1726853632.09465: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853632.09473: variable 'omit' from source: magic vars 28023 1726853632.09525: variable 'omit' from source: magic vars 28023 1726853632.09548: variable 'omit' from source: magic vars 28023 1726853632.09584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853632.09612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853632.09630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853632.09644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853632.09654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853632.09691: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853632.09695: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853632.09697: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 28023 1726853632.09766: Set connection var ansible_shell_type to sh 28023 1726853632.09773: Set connection var ansible_shell_executable to /bin/sh 28023 1726853632.09779: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853632.09785: Set connection var ansible_connection to ssh 28023 1726853632.09790: Set connection var ansible_pipelining to False 28023 1726853632.09795: Set connection var ansible_timeout to 10 28023 1726853632.09815: variable 'ansible_shell_executable' from source: unknown 28023 1726853632.09818: variable 'ansible_connection' from source: unknown 28023 1726853632.09821: variable 'ansible_module_compression' from source: unknown 28023 1726853632.09823: variable 'ansible_shell_type' from source: unknown 28023 1726853632.09826: variable 'ansible_shell_executable' from source: unknown 28023 1726853632.09828: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853632.09830: variable 'ansible_pipelining' from source: unknown 28023 1726853632.09835: variable 'ansible_timeout' from source: unknown 28023 1726853632.09837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853632.09981: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853632.09991: variable 'omit' from source: magic vars 28023 1726853632.09996: starting attempt loop 28023 1726853632.09999: running the handler 28023 1726853632.10011: _low_level_execute_command(): starting 28023 1726853632.10018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853632.10532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28023 1726853632.10536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853632.10541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853632.10591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853632.10594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853632.10597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853632.10667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853632.12354: stdout chunk (state=3): >>>/root <<< 28023 1726853632.12453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853632.12487: stderr chunk (state=3): >>><<< 28023 1726853632.12490: stdout chunk (state=3): >>><<< 28023 1726853632.12508: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853632.12519: _low_level_execute_command(): starting 28023 1726853632.12524: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473 `" && echo ansible-tmp-1726853632.1250775-29158-137338285448473="` echo /root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473 `" ) && sleep 0' 28023 1726853632.12957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853632.12960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853632.12963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28023 1726853632.12974: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853632.12977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853632.13020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853632.13024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853632.13090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853632.15011: stdout chunk (state=3): >>>ansible-tmp-1726853632.1250775-29158-137338285448473=/root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473 <<< 28023 1726853632.15122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853632.15149: stderr chunk (state=3): >>><<< 28023 1726853632.15152: stdout chunk (state=3): >>><<< 28023 1726853632.15172: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853632.1250775-29158-137338285448473=/root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853632.15377: variable 'ansible_module_compression' from source: unknown 28023 1726853632.15381: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28023 1726853632.15387: variable 'ansible_facts' from source: unknown 28023 1726853632.15530: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473/AnsiballZ_package_facts.py 28023 1726853632.15635: Sending initial data 28023 1726853632.15639: Sent initial data (162 bytes) 28023 1726853632.16280: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853632.16308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853632.16339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853632.16470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853632.18076: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 28023 1726853632.18091: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853632.18143: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853632.18203: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpxxerrtwb /root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473/AnsiballZ_package_facts.py <<< 28023 1726853632.18205: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473/AnsiballZ_package_facts.py" <<< 28023 1726853632.18262: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpxxerrtwb" to remote "/root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473/AnsiballZ_package_facts.py" <<< 28023 1726853632.20050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853632.20053: stderr chunk (state=3): >>><<< 28023 1726853632.20056: stdout chunk (state=3): >>><<< 28023 1726853632.20062: done transferring module to remote 28023 1726853632.20064: _low_level_execute_command(): starting 28023 1726853632.20066: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473/ /root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473/AnsiballZ_package_facts.py && sleep 0' 28023 1726853632.20659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853632.20677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853632.20690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853632.20737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853632.20760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853632.20810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853632.22785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853632.22789: stdout chunk (state=3): >>><<< 28023 1726853632.22791: stderr chunk (state=3): >>><<< 28023 1726853632.22803: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853632.22881: _low_level_execute_command(): starting 28023 1726853632.22884: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473/AnsiballZ_package_facts.py && sleep 0' 28023 1726853632.23383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853632.23403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853632.23420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853632.23439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853632.23457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853632.23470: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853632.23524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853632.23588: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853632.23604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853632.23714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853632.68248: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": 
"4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 28023 1726853632.68314: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 28023 1726853632.68339: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": 
"diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 28023 1726853632.68460: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": 
"cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", 
"release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": 
"sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": 
"python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": 
"9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", 
"release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", 
"version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 28023 1726853632.68472: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, 
"arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", 
"version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": 
[{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 28023 1726853632.68476: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 28023 1726853632.68503: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": 
[{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", 
"version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28023 1726853632.70313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853632.70342: stderr chunk (state=3): >>><<< 28023 1726853632.70344: stdout chunk (state=3): >>><<< 28023 1726853632.70384: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853632.72282: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853632.72344: _low_level_execute_command(): starting 28023 1726853632.72349: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853632.1250775-29158-137338285448473/ > /dev/null 2>&1 && sleep 0' 28023 1726853632.73160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853632.73214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853632.73282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853632.73286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853632.73338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853632.75351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853632.75359: stderr chunk (state=3): >>><<< 28023 1726853632.75363: stdout chunk (state=3): >>><<< 28023 1726853632.75365: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853632.75368: handler run complete 28023 1726853632.76374: variable 'ansible_facts' from source: 
unknown 28023 1726853632.76996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853632.78442: variable 'ansible_facts' from source: unknown 28023 1726853632.78686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853632.79074: attempt loop complete, returning result 28023 1726853632.79085: _execute() done 28023 1726853632.79090: dumping result to json 28023 1726853632.79208: done dumping result, returning 28023 1726853632.79218: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-fdb6-dad7-000000000569] 28023 1726853632.79224: sending task result for task 02083763-bbaf-fdb6-dad7-000000000569 28023 1726853632.81643: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000569 28023 1726853632.81647: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853632.81808: no more pending results, returning what we have 28023 1726853632.81810: results queue empty 28023 1726853632.81811: checking for any_errors_fatal 28023 1726853632.81815: done checking for any_errors_fatal 28023 1726853632.81815: checking for max_fail_percentage 28023 1726853632.81816: done checking for max_fail_percentage 28023 1726853632.81816: checking to see if all hosts have failed and the running result is not ok 28023 1726853632.81817: done checking to see if all hosts have failed 28023 1726853632.81818: getting the remaining hosts for this loop 28023 1726853632.81819: done getting the remaining hosts for this loop 28023 1726853632.81821: getting the next task for host managed_node3 28023 1726853632.81825: done getting next task for host managed_node3 28023 1726853632.81827: ^ task is: TASK: fedora.linux_system_roles.network : Print network 
provider 28023 1726853632.81829: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853632.81840: getting variables 28023 1726853632.81841: in VariableManager get_vars() 28023 1726853632.81887: Calling all_inventory to load vars for managed_node3 28023 1726853632.81890: Calling groups_inventory to load vars for managed_node3 28023 1726853632.81891: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853632.81898: Calling all_plugins_play to load vars for managed_node3 28023 1726853632.81900: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853632.81901: Calling groups_plugins_play to load vars for managed_node3 28023 1726853632.82622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853632.83651: done with get_vars() 28023 1726853632.83678: done getting variables 28023 1726853632.83722: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:33:52 -0400 (0:00:00.752) 
0:00:24.921 ****** 28023 1726853632.83747: entering _queue_task() for managed_node3/debug 28023 1726853632.83996: worker is 1 (out of 1 available) 28023 1726853632.84010: exiting _queue_task() for managed_node3/debug 28023 1726853632.84023: done queuing things up, now waiting for results queue to drain 28023 1726853632.84024: waiting for pending results... 28023 1726853632.84213: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 28023 1726853632.84312: in run() - task 02083763-bbaf-fdb6-dad7-00000000006d 28023 1726853632.84325: variable 'ansible_search_path' from source: unknown 28023 1726853632.84329: variable 'ansible_search_path' from source: unknown 28023 1726853632.84365: calling self._execute() 28023 1726853632.84436: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853632.84441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853632.84448: variable 'omit' from source: magic vars 28023 1726853632.84798: variable 'ansible_distribution_major_version' from source: facts 28023 1726853632.84811: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853632.84814: variable 'omit' from source: magic vars 28023 1726853632.84867: variable 'omit' from source: magic vars 28023 1726853632.85004: variable 'network_provider' from source: set_fact 28023 1726853632.85008: variable 'omit' from source: magic vars 28023 1726853632.85060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853632.85081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853632.85098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853632.85111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 
1726853632.85121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853632.85151: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853632.85154: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853632.85156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853632.85216: Set connection var ansible_shell_type to sh 28023 1726853632.85223: Set connection var ansible_shell_executable to /bin/sh 28023 1726853632.85229: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853632.85234: Set connection var ansible_connection to ssh 28023 1726853632.85239: Set connection var ansible_pipelining to False 28023 1726853632.85244: Set connection var ansible_timeout to 10 28023 1726853632.85288: variable 'ansible_shell_executable' from source: unknown 28023 1726853632.85292: variable 'ansible_connection' from source: unknown 28023 1726853632.85295: variable 'ansible_module_compression' from source: unknown 28023 1726853632.85297: variable 'ansible_shell_type' from source: unknown 28023 1726853632.85299: variable 'ansible_shell_executable' from source: unknown 28023 1726853632.85301: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853632.85303: variable 'ansible_pipelining' from source: unknown 28023 1726853632.85305: variable 'ansible_timeout' from source: unknown 28023 1726853632.85307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853632.85408: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853632.85416: variable 'omit' from source: 
magic vars 28023 1726853632.85421: starting attempt loop 28023 1726853632.85425: running the handler 28023 1726853632.85461: handler run complete 28023 1726853632.85475: attempt loop complete, returning result 28023 1726853632.85478: _execute() done 28023 1726853632.85481: dumping result to json 28023 1726853632.85507: done dumping result, returning 28023 1726853632.85510: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-fdb6-dad7-00000000006d] 28023 1726853632.85512: sending task result for task 02083763-bbaf-fdb6-dad7-00000000006d 28023 1726853632.85595: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000006d 28023 1726853632.85598: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 28023 1726853632.85693: no more pending results, returning what we have 28023 1726853632.85696: results queue empty 28023 1726853632.85697: checking for any_errors_fatal 28023 1726853632.85704: done checking for any_errors_fatal 28023 1726853632.85705: checking for max_fail_percentage 28023 1726853632.85706: done checking for max_fail_percentage 28023 1726853632.85707: checking to see if all hosts have failed and the running result is not ok 28023 1726853632.85708: done checking to see if all hosts have failed 28023 1726853632.85709: getting the remaining hosts for this loop 28023 1726853632.85710: done getting the remaining hosts for this loop 28023 1726853632.85713: getting the next task for host managed_node3 28023 1726853632.85719: done getting next task for host managed_node3 28023 1726853632.85722: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28023 1726853632.85725: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853632.85734: getting variables 28023 1726853632.85736: in VariableManager get_vars() 28023 1726853632.85776: Calling all_inventory to load vars for managed_node3 28023 1726853632.85779: Calling groups_inventory to load vars for managed_node3 28023 1726853632.85781: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853632.85791: Calling all_plugins_play to load vars for managed_node3 28023 1726853632.85793: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853632.85796: Calling groups_plugins_play to load vars for managed_node3 28023 1726853632.86949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853632.88131: done with get_vars() 28023 1726853632.88146: done getting variables 28023 1726853632.88189: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:33:52 -0400 (0:00:00.044) 0:00:24.966 ****** 28023 1726853632.88215: entering _queue_task() for managed_node3/fail 28023 1726853632.88564: worker is 1 (out of 1 
available) 28023 1726853632.88581: exiting _queue_task() for managed_node3/fail 28023 1726853632.88594: done queuing things up, now waiting for results queue to drain 28023 1726853632.88596: waiting for pending results... 28023 1726853632.89088: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28023 1726853632.89093: in run() - task 02083763-bbaf-fdb6-dad7-00000000006e 28023 1726853632.89097: variable 'ansible_search_path' from source: unknown 28023 1726853632.89100: variable 'ansible_search_path' from source: unknown 28023 1726853632.89104: calling self._execute() 28023 1726853632.89334: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853632.89420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853632.89464: variable 'omit' from source: magic vars 28023 1726853632.90136: variable 'ansible_distribution_major_version' from source: facts 28023 1726853632.90156: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853632.90242: variable 'network_state' from source: role '' defaults 28023 1726853632.90251: Evaluated conditional (network_state != {}): False 28023 1726853632.90254: when evaluation is False, skipping this task 28023 1726853632.90256: _execute() done 28023 1726853632.90262: dumping result to json 28023 1726853632.90265: done dumping result, returning 28023 1726853632.90277: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-fdb6-dad7-00000000006e] 28023 1726853632.90281: sending task result for task 02083763-bbaf-fdb6-dad7-00000000006e skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 28023 1726853632.90455: no more pending results, returning what we have 28023 1726853632.90459: results queue empty 28023 1726853632.90460: checking for any_errors_fatal 28023 1726853632.90493: done checking for any_errors_fatal 28023 1726853632.90496: checking for max_fail_percentage 28023 1726853632.90500: done checking for max_fail_percentage 28023 1726853632.90501: checking to see if all hosts have failed and the running result is not ok 28023 1726853632.90502: done checking to see if all hosts have failed 28023 1726853632.90508: getting the remaining hosts for this loop 28023 1726853632.90510: done getting the remaining hosts for this loop 28023 1726853632.90514: getting the next task for host managed_node3 28023 1726853632.90519: done getting next task for host managed_node3 28023 1726853632.90522: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28023 1726853632.90525: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853632.90534: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000006e 28023 1726853632.90537: WORKER PROCESS EXITING 28023 1726853632.90547: getting variables 28023 1726853632.90548: in VariableManager get_vars() 28023 1726853632.90585: Calling all_inventory to load vars for managed_node3 28023 1726853632.90588: Calling groups_inventory to load vars for managed_node3 28023 1726853632.90590: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853632.90599: Calling all_plugins_play to load vars for managed_node3 28023 1726853632.90601: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853632.90603: Calling groups_plugins_play to load vars for managed_node3 28023 1726853632.91821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853632.93264: done with get_vars() 28023 1726853632.93294: done getting variables 28023 1726853632.93376: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:33:52 -0400 (0:00:00.052) 0:00:25.018 ****** 28023 1726853632.93431: entering _queue_task() for managed_node3/fail 28023 1726853632.93736: worker is 1 (out of 1 available) 28023 1726853632.93751: exiting _queue_task() for managed_node3/fail 28023 1726853632.93767: done queuing things up, now waiting for results queue to drain 28023 1726853632.93768: waiting for pending results... 
28023 1726853632.93982: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28023 1726853632.94068: in run() - task 02083763-bbaf-fdb6-dad7-00000000006f 28023 1726853632.94087: variable 'ansible_search_path' from source: unknown 28023 1726853632.94099: variable 'ansible_search_path' from source: unknown 28023 1726853632.94199: calling self._execute() 28023 1726853632.94238: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853632.94242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853632.94250: variable 'omit' from source: magic vars 28023 1726853632.94552: variable 'ansible_distribution_major_version' from source: facts 28023 1726853632.94565: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853632.94661: variable 'network_state' from source: role '' defaults 28023 1726853632.94669: Evaluated conditional (network_state != {}): False 28023 1726853632.94674: when evaluation is False, skipping this task 28023 1726853632.94677: _execute() done 28023 1726853632.94680: dumping result to json 28023 1726853632.94683: done dumping result, returning 28023 1726853632.94721: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-fdb6-dad7-00000000006f] 28023 1726853632.94726: sending task result for task 02083763-bbaf-fdb6-dad7-00000000006f 28023 1726853632.94802: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000006f 28023 1726853632.94805: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853632.94864: no more pending results, returning what we have 28023 
1726853632.94868: results queue empty 28023 1726853632.94868: checking for any_errors_fatal 28023 1726853632.94879: done checking for any_errors_fatal 28023 1726853632.94880: checking for max_fail_percentage 28023 1726853632.94881: done checking for max_fail_percentage 28023 1726853632.94882: checking to see if all hosts have failed and the running result is not ok 28023 1726853632.94883: done checking to see if all hosts have failed 28023 1726853632.94884: getting the remaining hosts for this loop 28023 1726853632.94885: done getting the remaining hosts for this loop 28023 1726853632.94888: getting the next task for host managed_node3 28023 1726853632.94894: done getting next task for host managed_node3 28023 1726853632.94898: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28023 1726853632.94900: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853632.94916: getting variables 28023 1726853632.94918: in VariableManager get_vars() 28023 1726853632.94949: Calling all_inventory to load vars for managed_node3 28023 1726853632.94952: Calling groups_inventory to load vars for managed_node3 28023 1726853632.94954: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853632.94965: Calling all_plugins_play to load vars for managed_node3 28023 1726853632.94968: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853632.94970: Calling groups_plugins_play to load vars for managed_node3 28023 1726853632.95888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853632.96907: done with get_vars() 28023 1726853632.96921: done getting variables 28023 1726853632.96962: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:33:52 -0400 (0:00:00.035) 0:00:25.053 ****** 28023 1726853632.96986: entering _queue_task() for managed_node3/fail 28023 1726853632.97206: worker is 1 (out of 1 available) 28023 1726853632.97219: exiting _queue_task() for managed_node3/fail 28023 1726853632.97232: done queuing things up, now waiting for results queue to drain 28023 1726853632.97233: waiting for pending results... 
28023 1726853632.97477: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28023 1726853632.97629: in run() - task 02083763-bbaf-fdb6-dad7-000000000070 28023 1726853632.97639: variable 'ansible_search_path' from source: unknown 28023 1726853632.97659: variable 'ansible_search_path' from source: unknown 28023 1726853632.97664: calling self._execute() 28023 1726853632.97763: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853632.97767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853632.97791: variable 'omit' from source: magic vars 28023 1726853632.98072: variable 'ansible_distribution_major_version' from source: facts 28023 1726853632.98081: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853632.98212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853633.00353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853633.00576: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853633.00579: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853633.00581: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853633.00583: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853633.00599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.00629: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.00656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.00704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.00723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.00819: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.00838: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28023 1726853633.00945: variable 'ansible_distribution' from source: facts 28023 1726853633.00953: variable '__network_rh_distros' from source: role '' defaults 28023 1726853633.00965: Evaluated conditional (ansible_distribution in __network_rh_distros): True 28023 1726853633.01202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.01227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.01250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 
1726853633.01292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.01307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.01349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.01376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.01400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.01437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.01451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.01492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.01515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 28023 1726853633.01537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.01575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.01593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.01875: variable 'network_connections' from source: task vars 28023 1726853633.01895: variable 'interface1' from source: play vars 28023 1726853633.01947: variable 'interface1' from source: play vars 28023 1726853633.02000: variable 'interface1_mac' from source: set_fact 28023 1726853633.02019: variable 'network_state' from source: role '' defaults 28023 1726853633.02063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853633.02178: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853633.02205: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853633.02227: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853633.02251: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853633.02284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853633.02302: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853633.02319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.02336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853633.02367: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 28023 1726853633.02372: when evaluation is False, skipping this task 28023 1726853633.02375: _execute() done 28023 1726853633.02377: dumping result to json 28023 1726853633.02379: done dumping result, returning 28023 1726853633.02387: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-fdb6-dad7-000000000070] 28023 1726853633.02390: sending task result for task 02083763-bbaf-fdb6-dad7-000000000070 28023 1726853633.02483: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000070 28023 1726853633.02486: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 28023 
1726853633.02529: no more pending results, returning what we have 28023 1726853633.02532: results queue empty 28023 1726853633.02532: checking for any_errors_fatal 28023 1726853633.02542: done checking for any_errors_fatal 28023 1726853633.02542: checking for max_fail_percentage 28023 1726853633.02544: done checking for max_fail_percentage 28023 1726853633.02545: checking to see if all hosts have failed and the running result is not ok 28023 1726853633.02546: done checking to see if all hosts have failed 28023 1726853633.02547: getting the remaining hosts for this loop 28023 1726853633.02549: done getting the remaining hosts for this loop 28023 1726853633.02552: getting the next task for host managed_node3 28023 1726853633.02568: done getting next task for host managed_node3 28023 1726853633.02573: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28023 1726853633.02576: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853633.02592: getting variables 28023 1726853633.02593: in VariableManager get_vars() 28023 1726853633.02633: Calling all_inventory to load vars for managed_node3 28023 1726853633.02637: Calling groups_inventory to load vars for managed_node3 28023 1726853633.02639: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853633.02649: Calling all_plugins_play to load vars for managed_node3 28023 1726853633.02651: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853633.02653: Calling groups_plugins_play to load vars for managed_node3 28023 1726853633.03466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853633.04328: done with get_vars() 28023 1726853633.04345: done getting variables 28023 1726853633.04391: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:33:53 -0400 (0:00:00.074) 0:00:25.128 ****** 28023 1726853633.04416: entering _queue_task() for managed_node3/dnf 28023 1726853633.04655: worker is 1 (out of 1 available) 28023 1726853633.04674: exiting _queue_task() for managed_node3/dnf 28023 1726853633.04686: done queuing things up, now waiting for results queue to drain 28023 1726853633.04688: waiting for pending results... 
28023 1726853633.04860: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28023 1726853633.04947: in run() - task 02083763-bbaf-fdb6-dad7-000000000071 28023 1726853633.04962: variable 'ansible_search_path' from source: unknown 28023 1726853633.04966: variable 'ansible_search_path' from source: unknown 28023 1726853633.04994: calling self._execute() 28023 1726853633.05074: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853633.05078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853633.05087: variable 'omit' from source: magic vars 28023 1726853633.05353: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.05363: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853633.05499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853633.11084: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853633.11124: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853633.11152: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853633.11177: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853633.11196: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853633.11266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.11287: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.11304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.11329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.11340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.11419: variable 'ansible_distribution' from source: facts 28023 1726853633.11422: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.11433: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28023 1726853633.11515: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853633.11598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.11614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.11631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.11655: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.11666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.11698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.11714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.11729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.11752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.11763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.11791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.11809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 
1726853633.11825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.11849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.11861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.11956: variable 'network_connections' from source: task vars 28023 1726853633.11963: variable 'interface1' from source: play vars 28023 1726853633.12010: variable 'interface1' from source: play vars 28023 1726853633.12063: variable 'interface1_mac' from source: set_fact 28023 1726853633.12121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853633.12220: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853633.12249: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853633.12273: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853633.12293: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853633.12321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853633.12336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 28023 1726853633.12362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.12381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853633.12443: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853633.12693: variable 'network_connections' from source: task vars 28023 1726853633.12705: variable 'interface1' from source: play vars 28023 1726853633.12769: variable 'interface1' from source: play vars 28023 1726853633.12859: variable 'interface1_mac' from source: set_fact 28023 1726853633.12907: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28023 1726853633.12918: when evaluation is False, skipping this task 28023 1726853633.12924: _execute() done 28023 1726853633.12930: dumping result to json 28023 1726853633.12935: done dumping result, returning 28023 1726853633.12945: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-000000000071] 28023 1726853633.12953: sending task result for task 02083763-bbaf-fdb6-dad7-000000000071 28023 1726853633.13176: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000071 28023 1726853633.13180: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28023 1726853633.13230: no more pending results, returning what we have 28023 1726853633.13233: results queue empty 
28023 1726853633.13234: checking for any_errors_fatal 28023 1726853633.13243: done checking for any_errors_fatal 28023 1726853633.13244: checking for max_fail_percentage 28023 1726853633.13245: done checking for max_fail_percentage 28023 1726853633.13247: checking to see if all hosts have failed and the running result is not ok 28023 1726853633.13248: done checking to see if all hosts have failed 28023 1726853633.13248: getting the remaining hosts for this loop 28023 1726853633.13250: done getting the remaining hosts for this loop 28023 1726853633.13254: getting the next task for host managed_node3 28023 1726853633.13264: done getting next task for host managed_node3 28023 1726853633.13268: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28023 1726853633.13358: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853633.13378: getting variables 28023 1726853633.13380: in VariableManager get_vars() 28023 1726853633.13419: Calling all_inventory to load vars for managed_node3 28023 1726853633.13422: Calling groups_inventory to load vars for managed_node3 28023 1726853633.13424: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853633.13434: Calling all_plugins_play to load vars for managed_node3 28023 1726853633.13437: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853633.13440: Calling groups_plugins_play to load vars for managed_node3 28023 1726853633.20547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853633.24502: done with get_vars() 28023 1726853633.24535: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28023 1726853633.24599: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:33:53 -0400 (0:00:00.202) 0:00:25.330 ****** 28023 1726853633.24627: entering _queue_task() for managed_node3/yum 28023 1726853633.25580: worker is 1 (out of 1 available) 28023 1726853633.25590: exiting _queue_task() for managed_node3/yum 28023 1726853633.25600: done queuing things up, now waiting for results queue to drain 28023 1726853633.25601: waiting for pending results... 
28023 1726853633.26090: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28023 1726853633.26188: in run() - task 02083763-bbaf-fdb6-dad7-000000000072 28023 1726853633.26210: variable 'ansible_search_path' from source: unknown 28023 1726853633.26403: variable 'ansible_search_path' from source: unknown 28023 1726853633.26408: calling self._execute() 28023 1726853633.26547: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853633.26561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853633.26578: variable 'omit' from source: magic vars 28023 1726853633.27203: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.27221: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853633.27561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853633.30832: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853633.30916: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853633.30957: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853633.31005: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853633.31036: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853633.31121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.31154: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.31191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.31239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.31257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.31360: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.31383: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28023 1726853633.31394: when evaluation is False, skipping this task 28023 1726853633.31401: _execute() done 28023 1726853633.31429: dumping result to json 28023 1726853633.31432: done dumping result, returning 28023 1726853633.31436: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-000000000072] 28023 1726853633.31438: sending task result for task 02083763-bbaf-fdb6-dad7-000000000072 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28023 1726853633.31740: no more pending results, returning what we have 28023 1726853633.31743: results queue empty 28023 1726853633.31744: checking for any_errors_fatal 28023 1726853633.31753: done 
checking for any_errors_fatal 28023 1726853633.31753: checking for max_fail_percentage 28023 1726853633.31755: done checking for max_fail_percentage 28023 1726853633.31756: checking to see if all hosts have failed and the running result is not ok 28023 1726853633.31757: done checking to see if all hosts have failed 28023 1726853633.31758: getting the remaining hosts for this loop 28023 1726853633.31760: done getting the remaining hosts for this loop 28023 1726853633.31763: getting the next task for host managed_node3 28023 1726853633.31770: done getting next task for host managed_node3 28023 1726853633.31776: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28023 1726853633.31779: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853633.31799: getting variables 28023 1726853633.31801: in VariableManager get_vars() 28023 1726853633.31843: Calling all_inventory to load vars for managed_node3 28023 1726853633.31846: Calling groups_inventory to load vars for managed_node3 28023 1726853633.31849: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853633.31861: Calling all_plugins_play to load vars for managed_node3 28023 1726853633.31864: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853633.31867: Calling groups_plugins_play to load vars for managed_node3 28023 1726853633.31992: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000072 28023 1726853633.31995: WORKER PROCESS EXITING 28023 1726853633.33421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853633.35074: done with get_vars() 28023 1726853633.35096: done getting variables 28023 1726853633.35154: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:33:53 -0400 (0:00:00.105) 0:00:25.435 ****** 28023 1726853633.35188: entering _queue_task() for managed_node3/fail 28023 1726853633.35581: worker is 1 (out of 1 available) 28023 1726853633.35593: exiting _queue_task() for managed_node3/fail 28023 1726853633.35602: done queuing things up, now waiting for results queue to drain 28023 1726853633.35603: waiting for pending results... 
28023 1726853633.36041: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28023 1726853633.36377: in run() - task 02083763-bbaf-fdb6-dad7-000000000073 28023 1726853633.36421: variable 'ansible_search_path' from source: unknown 28023 1726853633.36424: variable 'ansible_search_path' from source: unknown 28023 1726853633.36440: calling self._execute() 28023 1726853633.36748: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853633.36752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853633.36754: variable 'omit' from source: magic vars 28023 1726853633.37515: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.37795: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853633.37831: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853633.38206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853633.42897: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853633.42968: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853633.43041: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853633.43330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853633.43334: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853633.43381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28023 1726853633.43417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.43577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.43620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.43638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.43789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.43822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.43854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.43902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.44019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.44128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.44132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.44237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.44283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.44301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.44779: variable 'network_connections' from source: task vars 28023 1726853633.44783: variable 'interface1' from source: play vars 28023 1726853633.44836: variable 'interface1' from source: play vars 28023 1726853633.44961: variable 'interface1_mac' from source: set_fact 28023 1726853633.45222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853633.45477: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853633.45570: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853633.45693: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853633.45789: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853633.46178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853633.46182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853633.46184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.46306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853633.46375: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853633.47204: variable 'network_connections' from source: task vars 28023 1726853633.47215: variable 'interface1' from source: play vars 28023 1726853633.47330: variable 'interface1' from source: play vars 28023 1726853633.47557: variable 'interface1_mac' from source: set_fact 28023 1726853633.47832: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28023 1726853633.47845: when evaluation is False, skipping this task 28023 1726853633.47854: _execute() done 28023 1726853633.47862: dumping result to json 28023 1726853633.47873: done dumping result, returning 28023 1726853633.48030: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 
[02083763-bbaf-fdb6-dad7-000000000073] 28023 1726853633.48043: sending task result for task 02083763-bbaf-fdb6-dad7-000000000073 28023 1726853633.48119: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000073 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28023 1726853633.48192: no more pending results, returning what we have 28023 1726853633.48196: results queue empty 28023 1726853633.48197: checking for any_errors_fatal 28023 1726853633.48202: done checking for any_errors_fatal 28023 1726853633.48203: checking for max_fail_percentage 28023 1726853633.48205: done checking for max_fail_percentage 28023 1726853633.48206: checking to see if all hosts have failed and the running result is not ok 28023 1726853633.48207: done checking to see if all hosts have failed 28023 1726853633.48208: getting the remaining hosts for this loop 28023 1726853633.48209: done getting the remaining hosts for this loop 28023 1726853633.48213: getting the next task for host managed_node3 28023 1726853633.48221: done getting next task for host managed_node3 28023 1726853633.48225: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28023 1726853633.48228: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853633.48248: getting variables 28023 1726853633.48250: in VariableManager get_vars() 28023 1726853633.48297: Calling all_inventory to load vars for managed_node3 28023 1726853633.48301: Calling groups_inventory to load vars for managed_node3 28023 1726853633.48304: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853633.48317: Calling all_plugins_play to load vars for managed_node3 28023 1726853633.48321: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853633.48324: Calling groups_plugins_play to load vars for managed_node3 28023 1726853633.49284: WORKER PROCESS EXITING 28023 1726853633.50581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853633.52153: done with get_vars() 28023 1726853633.52176: done getting variables 28023 1726853633.52232: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:33:53 -0400 (0:00:00.170) 0:00:25.606 ****** 28023 1726853633.52266: entering _queue_task() for managed_node3/package 28023 1726853633.52799: worker is 1 (out of 1 available) 28023 1726853633.52806: exiting _queue_task() for managed_node3/package 28023 1726853633.52817: done queuing things up, now waiting for results queue to drain 28023 1726853633.52818: waiting for pending results... 
28023 1726853633.52865: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 28023 1726853633.53009: in run() - task 02083763-bbaf-fdb6-dad7-000000000074 28023 1726853633.53028: variable 'ansible_search_path' from source: unknown 28023 1726853633.53039: variable 'ansible_search_path' from source: unknown 28023 1726853633.53154: calling self._execute() 28023 1726853633.53187: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853633.53198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853633.53213: variable 'omit' from source: magic vars 28023 1726853633.53569: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.53592: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853633.53779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853633.54054: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853633.54105: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853633.54146: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853633.54221: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853633.54339: variable 'network_packages' from source: role '' defaults 28023 1726853633.54449: variable '__network_provider_setup' from source: role '' defaults 28023 1726853633.54578: variable '__network_service_name_default_nm' from source: role '' defaults 28023 1726853633.54581: variable '__network_service_name_default_nm' from source: role '' defaults 28023 1726853633.54583: variable '__network_packages_default_nm' from source: role '' defaults 28023 1726853633.54613: variable 
'__network_packages_default_nm' from source: role '' defaults 28023 1726853633.54796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853633.57868: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853633.57937: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853633.58176: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853633.58179: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853633.58197: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853633.58367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.58441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.58547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.58846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.58849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 
1726853633.58852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.58854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.58856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.58957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.58980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.59458: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28023 1726853633.59632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.59698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.59727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.59774: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.59808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.60277: variable 'ansible_python' from source: facts 28023 1726853633.60285: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28023 1726853633.60311: variable '__network_wpa_supplicant_required' from source: role '' defaults 28023 1726853633.60401: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28023 1726853633.60533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.60563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.60593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.60640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.60659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.60708: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.60750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.60784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.60831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.60849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.61002: variable 'network_connections' from source: task vars 28023 1726853633.61045: variable 'interface1' from source: play vars 28023 1726853633.61127: variable 'interface1' from source: play vars 28023 1726853633.61551: variable 'interface1_mac' from source: set_fact 28023 1726853633.61605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853633.61634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853633.61678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 28023 1726853633.61712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853633.61766: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853633.62076: variable 'network_connections' from source: task vars 28023 1726853633.62090: variable 'interface1' from source: play vars 28023 1726853633.62194: variable 'interface1' from source: play vars 28023 1726853633.62323: variable 'interface1_mac' from source: set_fact 28023 1726853633.62394: variable '__network_packages_default_wireless' from source: role '' defaults 28023 1726853633.62486: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853633.62915: variable 'network_connections' from source: task vars 28023 1726853633.62965: variable 'interface1' from source: play vars 28023 1726853633.62999: variable 'interface1' from source: play vars 28023 1726853633.63094: variable 'interface1_mac' from source: set_fact 28023 1726853633.63126: variable '__network_packages_default_team' from source: role '' defaults 28023 1726853633.63213: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853633.63533: variable 'network_connections' from source: task vars 28023 1726853633.63542: variable 'interface1' from source: play vars 28023 1726853633.63721: variable 'interface1' from source: play vars 28023 1726853633.63724: variable 'interface1_mac' from source: set_fact 28023 1726853633.63773: variable '__network_service_name_default_initscripts' from source: role '' defaults 28023 1726853633.63841: variable '__network_service_name_default_initscripts' from source: role '' defaults 28023 1726853633.63852: variable '__network_packages_default_initscripts' from source: role '' defaults 28023 1726853633.63919: variable 
'__network_packages_default_initscripts' from source: role '' defaults 28023 1726853633.64154: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28023 1726853633.64842: variable 'network_connections' from source: task vars 28023 1726853633.64852: variable 'interface1' from source: play vars 28023 1726853633.64926: variable 'interface1' from source: play vars 28023 1726853633.65052: variable 'interface1_mac' from source: set_fact 28023 1726853633.65056: variable 'ansible_distribution' from source: facts 28023 1726853633.65073: variable '__network_rh_distros' from source: role '' defaults 28023 1726853633.65105: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.65169: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28023 1726853633.65783: variable 'ansible_distribution' from source: facts 28023 1726853633.65787: variable '__network_rh_distros' from source: role '' defaults 28023 1726853633.65789: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.65791: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28023 1726853633.65881: variable 'ansible_distribution' from source: facts 28023 1726853633.65926: variable '__network_rh_distros' from source: role '' defaults 28023 1726853633.65937: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.65984: variable 'network_provider' from source: set_fact 28023 1726853633.66011: variable 'ansible_facts' from source: unknown 28023 1726853633.66743: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28023 1726853633.66755: when evaluation is False, skipping this task 28023 1726853633.66766: _execute() done 28023 1726853633.66778: dumping result to json 28023 1726853633.66786: done dumping result, returning 28023 1726853633.66798: done running TaskExecutor() for 
managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-fdb6-dad7-000000000074] 28023 1726853633.66806: sending task result for task 02083763-bbaf-fdb6-dad7-000000000074 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28023 1726853633.67049: no more pending results, returning what we have 28023 1726853633.67053: results queue empty 28023 1726853633.67053: checking for any_errors_fatal 28023 1726853633.67066: done checking for any_errors_fatal 28023 1726853633.67066: checking for max_fail_percentage 28023 1726853633.67068: done checking for max_fail_percentage 28023 1726853633.67070: checking to see if all hosts have failed and the running result is not ok 28023 1726853633.67072: done checking to see if all hosts have failed 28023 1726853633.67073: getting the remaining hosts for this loop 28023 1726853633.67075: done getting the remaining hosts for this loop 28023 1726853633.67174: getting the next task for host managed_node3 28023 1726853633.67183: done getting next task for host managed_node3 28023 1726853633.67191: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28023 1726853633.67195: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853633.67205: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000074 28023 1726853633.67208: WORKER PROCESS EXITING 28023 1726853633.67220: getting variables 28023 1726853633.67221: in VariableManager get_vars() 28023 1726853633.67267: Calling all_inventory to load vars for managed_node3 28023 1726853633.67425: Calling groups_inventory to load vars for managed_node3 28023 1726853633.67429: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853633.67445: Calling all_plugins_play to load vars for managed_node3 28023 1726853633.67448: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853633.67451: Calling groups_plugins_play to load vars for managed_node3 28023 1726853633.68964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853633.71895: done with get_vars() 28023 1726853633.71922: done getting variables 28023 1726853633.71992: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:33:53 -0400 (0:00:00.197) 0:00:25.804 ****** 28023 1726853633.72029: entering _queue_task() for managed_node3/package 28023 1726853633.72581: worker is 1 (out of 1 available) 28023 1726853633.72593: exiting _queue_task() for managed_node3/package 28023 1726853633.72604: done queuing things up, now waiting for results queue to drain 28023 1726853633.72605: waiting for pending results... 
28023 1726853633.72804: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28023 1726853633.72952: in run() - task 02083763-bbaf-fdb6-dad7-000000000075 28023 1726853633.72978: variable 'ansible_search_path' from source: unknown 28023 1726853633.72985: variable 'ansible_search_path' from source: unknown 28023 1726853633.73029: calling self._execute() 28023 1726853633.73164: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853633.73179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853633.73218: variable 'omit' from source: magic vars 28023 1726853633.73597: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.73615: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853633.73746: variable 'network_state' from source: role '' defaults 28023 1726853633.73819: Evaluated conditional (network_state != {}): False 28023 1726853633.73822: when evaluation is False, skipping this task 28023 1726853633.73824: _execute() done 28023 1726853633.73826: dumping result to json 28023 1726853633.73829: done dumping result, returning 28023 1726853633.73831: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-fdb6-dad7-000000000075] 28023 1726853633.73835: sending task result for task 02083763-bbaf-fdb6-dad7-000000000075 28023 1726853633.74023: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000075 28023 1726853633.74027: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853633.74086: no more pending results, returning what we have 28023 1726853633.74090: results queue empty 28023 1726853633.74091: checking 
for any_errors_fatal 28023 1726853633.74097: done checking for any_errors_fatal 28023 1726853633.74097: checking for max_fail_percentage 28023 1726853633.74099: done checking for max_fail_percentage 28023 1726853633.74100: checking to see if all hosts have failed and the running result is not ok 28023 1726853633.74101: done checking to see if all hosts have failed 28023 1726853633.74102: getting the remaining hosts for this loop 28023 1726853633.74104: done getting the remaining hosts for this loop 28023 1726853633.74108: getting the next task for host managed_node3 28023 1726853633.74116: done getting next task for host managed_node3 28023 1726853633.74120: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28023 1726853633.74123: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853633.74143: getting variables 28023 1726853633.74145: in VariableManager get_vars() 28023 1726853633.74310: Calling all_inventory to load vars for managed_node3 28023 1726853633.74416: Calling groups_inventory to load vars for managed_node3 28023 1726853633.74419: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853633.74428: Calling all_plugins_play to load vars for managed_node3 28023 1726853633.74431: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853633.74434: Calling groups_plugins_play to load vars for managed_node3 28023 1726853633.76191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853633.77827: done with get_vars() 28023 1726853633.77852: done getting variables 28023 1726853633.77922: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:33:53 -0400 (0:00:00.059) 0:00:25.863 ****** 28023 1726853633.77959: entering _queue_task() for managed_node3/package 28023 1726853633.78311: worker is 1 (out of 1 available) 28023 1726853633.78437: exiting _queue_task() for managed_node3/package 28023 1726853633.78450: done queuing things up, now waiting for results queue to drain 28023 1726853633.78451: waiting for pending results... 
28023 1726853633.78642: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28023 1726853633.78807: in run() - task 02083763-bbaf-fdb6-dad7-000000000076 28023 1726853633.78855: variable 'ansible_search_path' from source: unknown 28023 1726853633.78858: variable 'ansible_search_path' from source: unknown 28023 1726853633.78869: calling self._execute() 28023 1726853633.78960: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853633.78965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853633.78970: variable 'omit' from source: magic vars 28023 1726853633.79249: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.79262: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853633.79342: variable 'network_state' from source: role '' defaults 28023 1726853633.79351: Evaluated conditional (network_state != {}): False 28023 1726853633.79354: when evaluation is False, skipping this task 28023 1726853633.79361: _execute() done 28023 1726853633.79364: dumping result to json 28023 1726853633.79367: done dumping result, returning 28023 1726853633.79370: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-fdb6-dad7-000000000076] 28023 1726853633.79378: sending task result for task 02083763-bbaf-fdb6-dad7-000000000076 28023 1726853633.79466: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000076 28023 1726853633.79469: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853633.79518: no more pending results, returning what we have 28023 1726853633.79521: results queue empty 28023 1726853633.79521: checking for 
any_errors_fatal 28023 1726853633.79533: done checking for any_errors_fatal 28023 1726853633.79534: checking for max_fail_percentage 28023 1726853633.79535: done checking for max_fail_percentage 28023 1726853633.79536: checking to see if all hosts have failed and the running result is not ok 28023 1726853633.79537: done checking to see if all hosts have failed 28023 1726853633.79538: getting the remaining hosts for this loop 28023 1726853633.79539: done getting the remaining hosts for this loop 28023 1726853633.79542: getting the next task for host managed_node3 28023 1726853633.79548: done getting next task for host managed_node3 28023 1726853633.79552: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28023 1726853633.79554: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853633.79575: getting variables 28023 1726853633.79577: in VariableManager get_vars() 28023 1726853633.79611: Calling all_inventory to load vars for managed_node3 28023 1726853633.79613: Calling groups_inventory to load vars for managed_node3 28023 1726853633.79615: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853633.79625: Calling all_plugins_play to load vars for managed_node3 28023 1726853633.79627: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853633.79630: Calling groups_plugins_play to load vars for managed_node3 28023 1726853633.80521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853633.81910: done with get_vars() 28023 1726853633.81926: done getting variables 28023 1726853633.81974: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:33:53 -0400 (0:00:00.040) 0:00:25.903 ****** 28023 1726853633.81997: entering _queue_task() for managed_node3/service 28023 1726853633.82231: worker is 1 (out of 1 available) 28023 1726853633.82245: exiting _queue_task() for managed_node3/service 28023 1726853633.82260: done queuing things up, now waiting for results queue to drain 28023 1726853633.82262: waiting for pending results... 
28023 1726853633.82441: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28023 1726853633.82532: in run() - task 02083763-bbaf-fdb6-dad7-000000000077 28023 1726853633.82544: variable 'ansible_search_path' from source: unknown 28023 1726853633.82547: variable 'ansible_search_path' from source: unknown 28023 1726853633.82579: calling self._execute() 28023 1726853633.82707: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853633.82712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853633.82715: variable 'omit' from source: magic vars 28023 1726853633.82947: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.82960: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853633.83047: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853633.83187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853633.85280: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853633.85285: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853633.85328: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853633.85342: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853633.85345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853633.85390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28023 1726853633.85410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.85426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.85464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.85482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.85515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.85532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.85549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.85583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.85593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.85621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.85637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.85653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.85684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.85695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.85804: variable 'network_connections' from source: task vars 28023 1726853633.85816: variable 'interface1' from source: play vars 28023 1726853633.85869: variable 'interface1' from source: play vars 28023 1726853633.85927: variable 'interface1_mac' from source: set_fact 28023 1726853633.85982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853633.86100: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853633.86128: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853633.86150: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853633.86172: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853633.86200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853633.86221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853633.86237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.86254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853633.86301: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853633.86448: variable 'network_connections' from source: task vars 28023 1726853633.86452: variable 'interface1' from source: play vars 28023 1726853633.86497: variable 'interface1' from source: play vars 28023 1726853633.86643: variable 'interface1_mac' from source: set_fact 28023 1726853633.86647: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28023 1726853633.86649: when evaluation is False, skipping this task 28023 1726853633.86650: _execute() done 28023 1726853633.86652: dumping result to json 28023 1726853633.86653: done dumping result, returning 28023 1726853633.86655: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-000000000077] 
28023 1726853633.86667: sending task result for task 02083763-bbaf-fdb6-dad7-000000000077 28023 1726853633.86726: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000077 28023 1726853633.86729: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28023 1726853633.86791: no more pending results, returning what we have 28023 1726853633.86793: results queue empty 28023 1726853633.86794: checking for any_errors_fatal 28023 1726853633.86799: done checking for any_errors_fatal 28023 1726853633.86800: checking for max_fail_percentage 28023 1726853633.86802: done checking for max_fail_percentage 28023 1726853633.86802: checking to see if all hosts have failed and the running result is not ok 28023 1726853633.86804: done checking to see if all hosts have failed 28023 1726853633.86804: getting the remaining hosts for this loop 28023 1726853633.86806: done getting the remaining hosts for this loop 28023 1726853633.86809: getting the next task for host managed_node3 28023 1726853633.86814: done getting next task for host managed_node3 28023 1726853633.86817: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28023 1726853633.86820: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853633.86834: getting variables 28023 1726853633.86835: in VariableManager get_vars() 28023 1726853633.86877: Calling all_inventory to load vars for managed_node3 28023 1726853633.86880: Calling groups_inventory to load vars for managed_node3 28023 1726853633.86882: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853633.86892: Calling all_plugins_play to load vars for managed_node3 28023 1726853633.86896: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853633.86899: Calling groups_plugins_play to load vars for managed_node3 28023 1726853633.88224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853633.89597: done with get_vars() 28023 1726853633.89614: done getting variables 28023 1726853633.89654: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:33:53 -0400 (0:00:00.076) 0:00:25.980 ****** 28023 1726853633.89682: entering _queue_task() for managed_node3/service 28023 1726853633.89910: worker is 1 (out of 1 available) 28023 1726853633.89922: exiting _queue_task() for managed_node3/service 28023 1726853633.89934: done queuing things up, now waiting for results queue to drain 28023 1726853633.89935: waiting for pending results... 
28023 1726853633.90107: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28023 1726853633.90197: in run() - task 02083763-bbaf-fdb6-dad7-000000000078 28023 1726853633.90209: variable 'ansible_search_path' from source: unknown 28023 1726853633.90213: variable 'ansible_search_path' from source: unknown 28023 1726853633.90242: calling self._execute() 28023 1726853633.90320: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853633.90324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853633.90333: variable 'omit' from source: magic vars 28023 1726853633.90605: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.90612: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853633.90718: variable 'network_provider' from source: set_fact 28023 1726853633.90722: variable 'network_state' from source: role '' defaults 28023 1726853633.90730: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28023 1726853633.90735: variable 'omit' from source: magic vars 28023 1726853633.90770: variable 'omit' from source: magic vars 28023 1726853633.90792: variable 'network_service_name' from source: role '' defaults 28023 1726853633.90845: variable 'network_service_name' from source: role '' defaults 28023 1726853633.90917: variable '__network_provider_setup' from source: role '' defaults 28023 1726853633.90922: variable '__network_service_name_default_nm' from source: role '' defaults 28023 1726853633.90968: variable '__network_service_name_default_nm' from source: role '' defaults 28023 1726853633.90977: variable '__network_packages_default_nm' from source: role '' defaults 28023 1726853633.91021: variable '__network_packages_default_nm' from source: role '' defaults 28023 1726853633.91168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 28023 1726853633.93176: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853633.93262: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853633.93315: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853633.93353: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853633.93477: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853633.93483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.93522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.93552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.93606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.93626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.93675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28023 1726853633.93712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.93742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.93786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.93807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.94127: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28023 1726853633.94214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.94234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.94252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.94281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.94291: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.94353: variable 'ansible_python' from source: facts 28023 1726853633.94375: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28023 1726853633.94428: variable '__network_wpa_supplicant_required' from source: role '' defaults 28023 1726853633.94486: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28023 1726853633.94567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.94587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.94604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.94627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.94638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.94675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853633.94695: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853633.94712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.94735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853633.94745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853633.94837: variable 'network_connections' from source: task vars 28023 1726853633.94843: variable 'interface1' from source: play vars 28023 1726853633.94900: variable 'interface1' from source: play vars 28023 1726853633.94960: variable 'interface1_mac' from source: set_fact 28023 1726853633.95044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853633.95175: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853633.95211: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853633.95241: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853633.95273: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853633.95319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 
1726853633.95340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853633.95366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853633.95390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853633.95426: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853633.95600: variable 'network_connections' from source: task vars 28023 1726853633.95605: variable 'interface1' from source: play vars 28023 1726853633.95657: variable 'interface1' from source: play vars 28023 1726853633.95721: variable 'interface1_mac' from source: set_fact 28023 1726853633.95765: variable '__network_packages_default_wireless' from source: role '' defaults 28023 1726853633.95819: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853633.96006: variable 'network_connections' from source: task vars 28023 1726853633.96015: variable 'interface1' from source: play vars 28023 1726853633.96276: variable 'interface1' from source: play vars 28023 1726853633.96279: variable 'interface1_mac' from source: set_fact 28023 1726853633.96281: variable '__network_packages_default_team' from source: role '' defaults 28023 1726853633.96283: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853633.96565: variable 'network_connections' from source: task vars 28023 1726853633.96582: variable 'interface1' from source: play vars 28023 1726853633.96642: variable 'interface1' from source: play vars 28023 1726853633.96737: variable 'interface1_mac' from 
source: set_fact 28023 1726853633.96810: variable '__network_service_name_default_initscripts' from source: role '' defaults 28023 1726853633.96876: variable '__network_service_name_default_initscripts' from source: role '' defaults 28023 1726853633.96888: variable '__network_packages_default_initscripts' from source: role '' defaults 28023 1726853633.96943: variable '__network_packages_default_initscripts' from source: role '' defaults 28023 1726853633.97150: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28023 1726853633.97645: variable 'network_connections' from source: task vars 28023 1726853633.97662: variable 'interface1' from source: play vars 28023 1726853633.97730: variable 'interface1' from source: play vars 28023 1726853633.97807: variable 'interface1_mac' from source: set_fact 28023 1726853633.97825: variable 'ansible_distribution' from source: facts 28023 1726853633.97832: variable '__network_rh_distros' from source: role '' defaults 28023 1726853633.97843: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.97874: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28023 1726853633.98051: variable 'ansible_distribution' from source: facts 28023 1726853633.98063: variable '__network_rh_distros' from source: role '' defaults 28023 1726853633.98075: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.98093: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28023 1726853633.98268: variable 'ansible_distribution' from source: facts 28023 1726853633.98279: variable '__network_rh_distros' from source: role '' defaults 28023 1726853633.98288: variable 'ansible_distribution_major_version' from source: facts 28023 1726853633.98325: variable 'network_provider' from source: set_fact 28023 1726853633.98353: variable 'omit' from source: magic vars 28023 1726853633.98390: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853633.98419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853633.98443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853633.98467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853633.98483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853633.98514: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853633.98521: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853633.98528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853633.98624: Set connection var ansible_shell_type to sh 28023 1726853633.98637: Set connection var ansible_shell_executable to /bin/sh 28023 1726853633.98875: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853633.98878: Set connection var ansible_connection to ssh 28023 1726853633.98881: Set connection var ansible_pipelining to False 28023 1726853633.98883: Set connection var ansible_timeout to 10 28023 1726853633.98884: variable 'ansible_shell_executable' from source: unknown 28023 1726853633.98886: variable 'ansible_connection' from source: unknown 28023 1726853633.98888: variable 'ansible_module_compression' from source: unknown 28023 1726853633.98890: variable 'ansible_shell_type' from source: unknown 28023 1726853633.98892: variable 'ansible_shell_executable' from source: unknown 28023 1726853633.98894: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853633.98895: variable 'ansible_pipelining' from source: unknown 28023 1726853633.98898: variable 'ansible_timeout' from 
source: unknown 28023 1726853633.98899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853633.98907: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853633.98909: variable 'omit' from source: magic vars 28023 1726853633.98911: starting attempt loop 28023 1726853633.98913: running the handler 28023 1726853633.98945: variable 'ansible_facts' from source: unknown 28023 1726853633.99639: _low_level_execute_command(): starting 28023 1726853633.99651: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853634.00340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853634.00353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853634.00373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853634.00392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853634.00409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853634.00419: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853634.00432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.00453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853634.00469: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853634.00554: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853634.00582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853634.00682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853634.02397: stdout chunk (state=3): >>>/root <<< 28023 1726853634.02534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853634.02545: stdout chunk (state=3): >>><<< 28023 1726853634.02560: stderr chunk (state=3): >>><<< 28023 1726853634.02587: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853634.02606: _low_level_execute_command(): starting 28023 1726853634.02618: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174 `" && echo ansible-tmp-1726853634.025949-29241-211213043876174="` echo /root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174 `" ) && sleep 0' 28023 1726853634.03226: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853634.03241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853634.03254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853634.03279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853634.03297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853634.03391: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.03407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853634.03425: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853634.03448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853634.03551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853634.05481: stdout chunk (state=3): >>>ansible-tmp-1726853634.025949-29241-211213043876174=/root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174 <<< 28023 1726853634.05603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853634.05613: stdout chunk (state=3): >>><<< 28023 1726853634.05625: stderr chunk (state=3): >>><<< 28023 1726853634.05638: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853634.025949-29241-211213043876174=/root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 28023 1726853634.05666: variable 'ansible_module_compression' from source: unknown 28023 1726853634.05705: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28023 1726853634.05763: variable 'ansible_facts' from source: unknown 28023 1726853634.05900: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174/AnsiballZ_systemd.py 28023 1726853634.06003: Sending initial data 28023 1726853634.06007: Sent initial data (155 bytes) 28023 1726853634.06545: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853634.06549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.06551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853634.06554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.06669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853634.06675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 
1726853634.06733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853634.08332: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28023 1726853634.08338: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853634.08392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853634.08450: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpdp1zhzvv /root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174/AnsiballZ_systemd.py <<< 28023 1726853634.08457: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174/AnsiballZ_systemd.py" <<< 28023 1726853634.08509: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpdp1zhzvv" to remote "/root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174/AnsiballZ_systemd.py" <<< 28023 1726853634.08511: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174/AnsiballZ_systemd.py" <<< 28023 1726853634.09877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853634.09881: stderr chunk (state=3): >>><<< 28023 1726853634.09884: stdout chunk (state=3): >>><<< 28023 1726853634.09886: done transferring module to remote 28023 1726853634.09888: _low_level_execute_command(): starting 28023 1726853634.09890: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174/ /root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174/AnsiballZ_systemd.py && sleep 0' 28023 1726853634.10447: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853634.10455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853634.10466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853634.10488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853634.10504: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853634.10510: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.10534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853634.10537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.10598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853634.10609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853634.10680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853634.12548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853634.12576: stderr chunk (state=3): >>><<< 28023 1726853634.12578: stdout chunk (state=3): >>><<< 28023 1726853634.12590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853634.12667: _low_level_execute_command(): starting 28023 1726853634.12670: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174/AnsiballZ_systemd.py && sleep 0' 28023 1726853634.13144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853634.13161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.13185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.13236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853634.13240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853634.13251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853634.13346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853634.42744: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10719232", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313754112", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1916779000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": 
"0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 28023 1726853634.42767: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", 
"TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysini<<< 28023 1726853634.42800: stdout chunk (state=3): >>>t.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28023 1726853634.44832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853634.44836: stdout chunk (state=3): >>><<< 28023 1726853634.44838: stderr chunk (state=3): >>><<< 28023 1726853634.44856: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10719232", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313754112", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1916779000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853634.45077: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853634.45137: _low_level_execute_command(): starting 28023 1726853634.45147: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853634.025949-29241-211213043876174/ > /dev/null 2>&1 && sleep 0' 28023 1726853634.45645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853634.45648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853634.45677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853634.45680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853634.45683: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.45734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853634.45738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853634.45740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853634.45808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853634.47765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853634.47769: stdout chunk (state=3): >>><<< 28023 1726853634.47778: stderr chunk (state=3): >>><<< 28023 1726853634.47781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853634.47791: handler run complete 28023 1726853634.47854: attempt loop complete, returning result 28023 1726853634.47858: _execute() done 28023 1726853634.47860: dumping result to json 28023 1726853634.47904: done dumping result, returning 28023 1726853634.47909: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-fdb6-dad7-000000000078] 28023 1726853634.47911: sending task result for task 02083763-bbaf-fdb6-dad7-000000000078 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853634.48201: no more pending results, returning what we have 28023 1726853634.48204: results queue empty 28023 1726853634.48205: checking for any_errors_fatal 28023 1726853634.48213: done checking for any_errors_fatal 28023 1726853634.48214: checking for max_fail_percentage 28023 1726853634.48215: done checking for max_fail_percentage 28023 1726853634.48216: checking to see if all hosts have failed and the running result is not ok 28023 1726853634.48217: done checking to see if all hosts have failed 28023 1726853634.48218: getting the remaining hosts for this loop 28023 1726853634.48220: done getting the remaining hosts for this loop 28023 1726853634.48222: getting the next task for host managed_node3 28023 1726853634.48228: done getting next task for host managed_node3 28023 1726853634.48231: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28023 1726853634.48235: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853634.48244: getting variables 28023 1726853634.48246: in VariableManager get_vars() 28023 1726853634.48284: Calling all_inventory to load vars for managed_node3 28023 1726853634.48287: Calling groups_inventory to load vars for managed_node3 28023 1726853634.48289: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853634.48300: Calling all_plugins_play to load vars for managed_node3 28023 1726853634.48302: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853634.48304: Calling groups_plugins_play to load vars for managed_node3 28023 1726853634.48885: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000078 28023 1726853634.48889: WORKER PROCESS EXITING 28023 1726853634.49714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853634.52486: done with get_vars() 28023 1726853634.52519: done getting variables 28023 1726853634.52892: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:33:54 -0400 (0:00:00.632) 0:00:26.613 ****** 28023 1726853634.52928: entering _queue_task() for 
managed_node3/service 28023 1726853634.53588: worker is 1 (out of 1 available) 28023 1726853634.53599: exiting _queue_task() for managed_node3/service 28023 1726853634.53610: done queuing things up, now waiting for results queue to drain 28023 1726853634.53611: waiting for pending results... 28023 1726853634.53712: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28023 1726853634.53875: in run() - task 02083763-bbaf-fdb6-dad7-000000000079 28023 1726853634.53897: variable 'ansible_search_path' from source: unknown 28023 1726853634.53906: variable 'ansible_search_path' from source: unknown 28023 1726853634.53952: calling self._execute() 28023 1726853634.54070: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853634.54085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853634.54102: variable 'omit' from source: magic vars 28023 1726853634.54498: variable 'ansible_distribution_major_version' from source: facts 28023 1726853634.54516: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853634.54635: variable 'network_provider' from source: set_fact 28023 1726853634.54648: Evaluated conditional (network_provider == "nm"): True 28023 1726853634.54747: variable '__network_wpa_supplicant_required' from source: role '' defaults 28023 1726853634.54839: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28023 1726853634.55026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853634.57240: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853634.57319: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853634.57360: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853634.57402: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853634.57528: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853634.57536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853634.57573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853634.57604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853634.57653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853634.57676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853634.57729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853634.57764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853634.57795: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853634.57836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853634.57857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853634.57906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853634.57935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853634.57987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853634.58012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853634.58032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853634.58186: variable 'network_connections' from source: task vars 28023 1726853634.58376: variable 'interface1' from source: play vars 28023 1726853634.58380: variable 'interface1' from source: play vars 28023 1726853634.58383: 
variable 'interface1_mac' from source: set_fact 28023 1726853634.58461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853634.58640: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853634.58683: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853634.58723: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853634.58755: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853634.58804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853634.58837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853634.58868: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853634.58900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853634.58958: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853634.59410: variable 'network_connections' from source: task vars 28023 1726853634.59421: variable 'interface1' from source: play vars 28023 1726853634.59490: variable 'interface1' from source: play vars 28023 1726853634.59563: variable 'interface1_mac' from source: set_fact 28023 1726853634.59622: Evaluated conditional 
(__network_wpa_supplicant_required): False 28023 1726853634.59630: when evaluation is False, skipping this task 28023 1726853634.59637: _execute() done 28023 1726853634.59644: dumping result to json 28023 1726853634.59651: done dumping result, returning 28023 1726853634.59694: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-fdb6-dad7-000000000079] 28023 1726853634.59697: sending task result for task 02083763-bbaf-fdb6-dad7-000000000079 28023 1726853634.60077: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000079 28023 1726853634.60081: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28023 1726853634.60126: no more pending results, returning what we have 28023 1726853634.60130: results queue empty 28023 1726853634.60131: checking for any_errors_fatal 28023 1726853634.60149: done checking for any_errors_fatal 28023 1726853634.60150: checking for max_fail_percentage 28023 1726853634.60152: done checking for max_fail_percentage 28023 1726853634.60153: checking to see if all hosts have failed and the running result is not ok 28023 1726853634.60154: done checking to see if all hosts have failed 28023 1726853634.60154: getting the remaining hosts for this loop 28023 1726853634.60156: done getting the remaining hosts for this loop 28023 1726853634.60160: getting the next task for host managed_node3 28023 1726853634.60166: done getting next task for host managed_node3 28023 1726853634.60169: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28023 1726853634.60174: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853634.60190: getting variables 28023 1726853634.60192: in VariableManager get_vars() 28023 1726853634.60235: Calling all_inventory to load vars for managed_node3 28023 1726853634.60239: Calling groups_inventory to load vars for managed_node3 28023 1726853634.60242: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853634.60253: Calling all_plugins_play to load vars for managed_node3 28023 1726853634.60256: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853634.60259: Calling groups_plugins_play to load vars for managed_node3 28023 1726853634.61760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853634.63278: done with get_vars() 28023 1726853634.63306: done getting variables 28023 1726853634.63365: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:33:54 -0400 (0:00:00.104) 0:00:26.717 ****** 28023 1726853634.63402: entering _queue_task() for managed_node3/service 28023 1726853634.63813: worker is 1 (out of 1 available) 28023 1726853634.63827: exiting _queue_task() for managed_node3/service 
28023 1726853634.63840: done queuing things up, now waiting for results queue to drain 28023 1726853634.63841: waiting for pending results... 28023 1726853634.64098: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 28023 1726853634.64245: in run() - task 02083763-bbaf-fdb6-dad7-00000000007a 28023 1726853634.64269: variable 'ansible_search_path' from source: unknown 28023 1726853634.64280: variable 'ansible_search_path' from source: unknown 28023 1726853634.64328: calling self._execute() 28023 1726853634.64438: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853634.64452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853634.64468: variable 'omit' from source: magic vars 28023 1726853634.64860: variable 'ansible_distribution_major_version' from source: facts 28023 1726853634.64885: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853634.65008: variable 'network_provider' from source: set_fact 28023 1726853634.65019: Evaluated conditional (network_provider == "initscripts"): False 28023 1726853634.65087: when evaluation is False, skipping this task 28023 1726853634.65091: _execute() done 28023 1726853634.65093: dumping result to json 28023 1726853634.65095: done dumping result, returning 28023 1726853634.65098: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-fdb6-dad7-00000000007a] 28023 1726853634.65101: sending task result for task 02083763-bbaf-fdb6-dad7-00000000007a 28023 1726853634.65178: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000007a 28023 1726853634.65181: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853634.65240: no more pending results, returning what we have 
28023 1726853634.65243: results queue empty 28023 1726853634.65244: checking for any_errors_fatal 28023 1726853634.65258: done checking for any_errors_fatal 28023 1726853634.65259: checking for max_fail_percentage 28023 1726853634.65260: done checking for max_fail_percentage 28023 1726853634.65261: checking to see if all hosts have failed and the running result is not ok 28023 1726853634.65262: done checking to see if all hosts have failed 28023 1726853634.65263: getting the remaining hosts for this loop 28023 1726853634.65265: done getting the remaining hosts for this loop 28023 1726853634.65269: getting the next task for host managed_node3 28023 1726853634.65379: done getting next task for host managed_node3 28023 1726853634.65385: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28023 1726853634.65389: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853634.65411: getting variables 28023 1726853634.65413: in VariableManager get_vars() 28023 1726853634.65459: Calling all_inventory to load vars for managed_node3 28023 1726853634.65463: Calling groups_inventory to load vars for managed_node3 28023 1726853634.65465: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853634.65680: Calling all_plugins_play to load vars for managed_node3 28023 1726853634.65685: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853634.65689: Calling groups_plugins_play to load vars for managed_node3 28023 1726853634.67021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853634.68578: done with get_vars() 28023 1726853634.68607: done getting variables 28023 1726853634.68667: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:33:54 -0400 (0:00:00.053) 0:00:26.770 ****** 28023 1726853634.68706: entering _queue_task() for managed_node3/copy 28023 1726853634.69055: worker is 1 (out of 1 available) 28023 1726853634.69069: exiting _queue_task() for managed_node3/copy 28023 1726853634.69284: done queuing things up, now waiting for results queue to drain 28023 1726853634.69285: waiting for pending results... 
28023 1726853634.69490: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28023 1726853634.69539: in run() - task 02083763-bbaf-fdb6-dad7-00000000007b 28023 1726853634.69560: variable 'ansible_search_path' from source: unknown 28023 1726853634.69569: variable 'ansible_search_path' from source: unknown 28023 1726853634.69615: calling self._execute() 28023 1726853634.69728: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853634.69744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853634.69759: variable 'omit' from source: magic vars 28023 1726853634.70174: variable 'ansible_distribution_major_version' from source: facts 28023 1726853634.70178: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853634.70279: variable 'network_provider' from source: set_fact 28023 1726853634.70291: Evaluated conditional (network_provider == "initscripts"): False 28023 1726853634.70298: when evaluation is False, skipping this task 28023 1726853634.70305: _execute() done 28023 1726853634.70311: dumping result to json 28023 1726853634.70318: done dumping result, returning 28023 1726853634.70378: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-fdb6-dad7-00000000007b] 28023 1726853634.70387: sending task result for task 02083763-bbaf-fdb6-dad7-00000000007b 28023 1726853634.70469: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000007b 28023 1726853634.70473: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28023 1726853634.70544: no more pending results, returning what we have 28023 1726853634.70548: results queue empty 28023 1726853634.70549: checking for 
any_errors_fatal 28023 1726853634.70555: done checking for any_errors_fatal 28023 1726853634.70555: checking for max_fail_percentage 28023 1726853634.70557: done checking for max_fail_percentage 28023 1726853634.70558: checking to see if all hosts have failed and the running result is not ok 28023 1726853634.70559: done checking to see if all hosts have failed 28023 1726853634.70560: getting the remaining hosts for this loop 28023 1726853634.70562: done getting the remaining hosts for this loop 28023 1726853634.70566: getting the next task for host managed_node3 28023 1726853634.70575: done getting next task for host managed_node3 28023 1726853634.70580: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28023 1726853634.70584: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853634.70605: getting variables 28023 1726853634.70607: in VariableManager get_vars() 28023 1726853634.70655: Calling all_inventory to load vars for managed_node3 28023 1726853634.70659: Calling groups_inventory to load vars for managed_node3 28023 1726853634.70662: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853634.70778: Calling all_plugins_play to load vars for managed_node3 28023 1726853634.70782: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853634.70786: Calling groups_plugins_play to load vars for managed_node3 28023 1726853634.72354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853634.73868: done with get_vars() 28023 1726853634.73900: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:33:54 -0400 (0:00:00.052) 0:00:26.823 ****** 28023 1726853634.73988: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 28023 1726853634.74358: worker is 1 (out of 1 available) 28023 1726853634.74574: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 28023 1726853634.74588: done queuing things up, now waiting for results queue to drain 28023 1726853634.74589: waiting for pending results... 
28023 1726853634.74790: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28023 1726853634.74853: in run() - task 02083763-bbaf-fdb6-dad7-00000000007c 28023 1726853634.74876: variable 'ansible_search_path' from source: unknown 28023 1726853634.74884: variable 'ansible_search_path' from source: unknown 28023 1726853634.74976: calling self._execute() 28023 1726853634.75044: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853634.75056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853634.75069: variable 'omit' from source: magic vars 28023 1726853634.75468: variable 'ansible_distribution_major_version' from source: facts 28023 1726853634.75489: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853634.75500: variable 'omit' from source: magic vars 28023 1726853634.75558: variable 'omit' from source: magic vars 28023 1726853634.75732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853634.77956: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853634.77969: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853634.78014: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853634.78049: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853634.78087: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853634.78180: variable 'network_provider' from source: set_fact 28023 1726853634.78322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853634.78396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853634.78414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853634.78462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853634.78504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853634.78573: variable 'omit' from source: magic vars 28023 1726853634.78720: variable 'omit' from source: magic vars 28023 1726853634.78811: variable 'network_connections' from source: task vars 28023 1726853634.78833: variable 'interface1' from source: play vars 28023 1726853634.78910: variable 'interface1' from source: play vars 28023 1726853634.79047: variable 'interface1_mac' from source: set_fact 28023 1726853634.79183: variable 'omit' from source: magic vars 28023 1726853634.79196: variable '__lsr_ansible_managed' from source: task vars 28023 1726853634.79255: variable '__lsr_ansible_managed' from source: task vars 28023 1726853634.79535: Loaded config def from plugin (lookup/template) 28023 1726853634.79586: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28023 1726853634.79589: File lookup term: get_ansible_managed.j2 28023 1726853634.79591: variable 'ansible_search_path' from source: 
unknown 28023 1726853634.79594: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28023 1726853634.79611: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28023 1726853634.79634: variable 'ansible_search_path' from source: unknown 28023 1726853634.87686: variable 'ansible_managed' from source: unknown 28023 1726853634.87803: variable 'omit' from source: magic vars 28023 1726853634.87830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853634.87857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853634.87890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853634.87986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 28023 1726853634.87991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853634.87994: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853634.87996: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853634.87998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853634.88075: Set connection var ansible_shell_type to sh 28023 1726853634.88078: Set connection var ansible_shell_executable to /bin/sh 28023 1726853634.88080: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853634.88083: Set connection var ansible_connection to ssh 28023 1726853634.88084: Set connection var ansible_pipelining to False 28023 1726853634.88086: Set connection var ansible_timeout to 10 28023 1726853634.88111: variable 'ansible_shell_executable' from source: unknown 28023 1726853634.88114: variable 'ansible_connection' from source: unknown 28023 1726853634.88117: variable 'ansible_module_compression' from source: unknown 28023 1726853634.88119: variable 'ansible_shell_type' from source: unknown 28023 1726853634.88121: variable 'ansible_shell_executable' from source: unknown 28023 1726853634.88124: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853634.88126: variable 'ansible_pipelining' from source: unknown 28023 1726853634.88128: variable 'ansible_timeout' from source: unknown 28023 1726853634.88130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853634.88329: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853634.88341: variable 'omit' from source: magic vars 28023 
1726853634.88344: starting attempt loop 28023 1726853634.88347: running the handler 28023 1726853634.88349: _low_level_execute_command(): starting 28023 1726853634.88351: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853634.89100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853634.89104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853634.89107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853634.89120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853634.89132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853634.89199: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.89308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853634.89352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853634.89576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853634.91247: stdout chunk (state=3): >>>/root <<< 28023 1726853634.91347: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 28023 1726853634.91400: stderr chunk (state=3): >>><<< 28023 1726853634.91413: stdout chunk (state=3): >>><<< 28023 1726853634.91441: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853634.91454: _low_level_execute_command(): starting 28023 1726853634.91463: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174 `" && echo ansible-tmp-1726853634.9144187-29274-253355939840174="` echo /root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174 `" ) && sleep 0' 28023 1726853634.92032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 
1726853634.92038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853634.92054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853634.92077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.92088: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853634.92094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.92164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853634.92201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853634.92255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853634.94180: stdout chunk (state=3): >>>ansible-tmp-1726853634.9144187-29274-253355939840174=/root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174 <<< 28023 1726853634.94333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853634.94336: stdout chunk (state=3): >>><<< 28023 1726853634.94338: stderr chunk (state=3): >>><<< 28023 1726853634.94362: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853634.9144187-29274-253355939840174=/root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853634.94585: variable 'ansible_module_compression' from source: unknown 28023 1726853634.94589: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28023 1726853634.94591: variable 'ansible_facts' from source: unknown 28023 1726853634.94637: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174/AnsiballZ_network_connections.py 28023 1726853634.94814: Sending initial data 28023 1726853634.94822: Sent initial data (168 bytes) 28023 1726853634.95426: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853634.95443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853634.95485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.95501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853634.95580: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.95606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853634.95622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853634.95641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853634.95730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853634.97330: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853634.97403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853634.97489: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp4rx39z3s /root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174/AnsiballZ_network_connections.py <<< 28023 1726853634.97499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174/AnsiballZ_network_connections.py" <<< 28023 1726853634.97552: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp4rx39z3s" to remote "/root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174/AnsiballZ_network_connections.py" <<< 28023 1726853634.98781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853634.98784: stdout chunk (state=3): >>><<< 28023 1726853634.98786: stderr chunk (state=3): >>><<< 28023 1726853634.98788: done transferring module to remote 28023 1726853634.98790: _low_level_execute_command(): starting 28023 1726853634.98793: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174/ /root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174/AnsiballZ_network_connections.py && sleep 0' 28023 1726853634.99417: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853634.99444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853634.99485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853634.99501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853634.99517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853634.99551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853634.99614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853634.99637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853634.99655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853634.99744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853635.01695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853635.01699: stdout chunk (state=3): >>><<< 28023 1726853635.01701: stderr chunk (state=3): >>><<< 28023 1726853635.01720: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853635.01778: _low_level_execute_command(): starting 28023 1726853635.01781: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174/AnsiballZ_network_connections.py && sleep 0' 28023 1726853635.02394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853635.02406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853635.02429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853635.02490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853635.02554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853635.02577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853635.02597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853635.02696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853635.31423: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "32:38:a6:2e:17:d5", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "32:38:a6:2e:17:d5", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", 
"ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28023 1726853635.33273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853635.33277: stdout chunk (state=3): >>><<< 28023 1726853635.33280: stderr chunk (state=3): >>><<< 28023 1726853635.33298: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "32:38:a6:2e:17:d5", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "32:38:a6:2e:17:d5", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853635.33435: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest1', 'mac': '32:38:a6:2e:17:d5', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.4/24', '2001:db8::6/32'], 'route': [{'network': '198.58.10.64', 'prefix': 26, 'gateway': '198.51.100.102', 'metric': 4}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853635.33439: _low_level_execute_command(): starting 28023 1726853635.33441: _low_level_execute_command(): executing: /bin/sh -c 
'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853634.9144187-29274-253355939840174/ > /dev/null 2>&1 && sleep 0' 28023 1726853635.34390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853635.34406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853635.34420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853635.34436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853635.34452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853635.34548: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853635.34690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853635.34782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853635.36703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853635.36715: stdout chunk (state=3): >>><<< 28023 1726853635.36743: stderr chunk (state=3): >>><<< 28023 1726853635.36767: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853635.36789: handler run complete 28023 1726853635.36831: attempt loop complete, returning result 28023 1726853635.36839: _execute() done 28023 1726853635.36846: dumping result to json 28023 1726853635.36856: done dumping result, returning 28023 1726853635.36888: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-fdb6-dad7-00000000007c] 28023 1726853635.36891: sending task result for task 02083763-bbaf-fdb6-dad7-00000000007c changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": 
"198.58.10.64", "prefix": 26 } ] }, "mac": "32:38:a6:2e:17:d5", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34 28023 1726853635.37246: no more pending results, returning what we have 28023 1726853635.37254: results queue empty 28023 1726853635.37255: checking for any_errors_fatal 28023 1726853635.37269: done checking for any_errors_fatal 28023 1726853635.37270: checking for max_fail_percentage 28023 1726853635.37312: done checking for max_fail_percentage 28023 1726853635.37314: checking to see if all hosts have failed and the running result is not ok 28023 1726853635.37315: done checking to see if all hosts have failed 28023 1726853635.37315: getting the remaining hosts for this loop 28023 1726853635.37317: done getting the remaining hosts for this loop 28023 1726853635.37321: getting the next task for host managed_node3 28023 1726853635.37343: done getting next task for host managed_node3 28023 1726853635.37347: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28023 1726853635.37350: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853635.37361: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000007c 28023 1726853635.37363: WORKER PROCESS EXITING 28023 1726853635.37372: getting variables 28023 1726853635.37374: in VariableManager get_vars() 28023 1726853635.37411: Calling all_inventory to load vars for managed_node3 28023 1726853635.37414: Calling groups_inventory to load vars for managed_node3 28023 1726853635.37416: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853635.37426: Calling all_plugins_play to load vars for managed_node3 28023 1726853635.37429: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853635.37432: Calling groups_plugins_play to load vars for managed_node3 28023 1726853635.38232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853635.39114: done with get_vars() 28023 1726853635.39130: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:33:55 -0400 (0:00:00.652) 0:00:27.476 ****** 28023 1726853635.39253: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 28023 1726853635.39624: worker is 1 (out of 1 available) 28023 1726853635.39640: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 28023 1726853635.39653: done queuing things up, now waiting for results queue to drain 28023 1726853635.39654: waiting for pending results... 
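The `module_args` dumped in the "Configure networking connection profiles" result above imply role input roughly like the following. This is a reconstruction, not the actual playbook: the variable name `network_connections` follows the documented fedora.linux_system_roles.network interface, and every value is copied from the logged invocation.

```yaml
# Hedged reconstruction of the role input behind the logged module_args.
# Values (name, mac, addresses, route) are taken verbatim from the log above.
network_connections:
  - name: ethtest1
    type: ethernet
    mac: "32:38:a6:2e:17:d5"
    autoconnect: false
    ip:
      address:
        - 198.51.100.4/24
        - 2001:db8::6/32
      route:
        - network: 198.58.10.64
          prefix: 26
          gateway: 198.51.100.102
          metric: 4
```

The module's own `stderr` line (`[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, …`) confirms this was applied as an update to an existing NetworkManager profile rather than a fresh create.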
28023 1726853635.39866: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 28023 1726853635.40079: in run() - task 02083763-bbaf-fdb6-dad7-00000000007d 28023 1726853635.40133: variable 'ansible_search_path' from source: unknown 28023 1726853635.40136: variable 'ansible_search_path' from source: unknown 28023 1726853635.40202: calling self._execute() 28023 1726853635.40279: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.40283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853635.40292: variable 'omit' from source: magic vars 28023 1726853635.40580: variable 'ansible_distribution_major_version' from source: facts 28023 1726853635.40588: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853635.40668: variable 'network_state' from source: role '' defaults 28023 1726853635.40678: Evaluated conditional (network_state != {}): False 28023 1726853635.40689: when evaluation is False, skipping this task 28023 1726853635.40693: _execute() done 28023 1726853635.40696: dumping result to json 28023 1726853635.40700: done dumping result, returning 28023 1726853635.40703: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-fdb6-dad7-00000000007d] 28023 1726853635.40705: sending task result for task 02083763-bbaf-fdb6-dad7-00000000007d 28023 1726853635.40785: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000007d 28023 1726853635.40793: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853635.40842: no more pending results, returning what we have 28023 1726853635.40845: results queue empty 28023 1726853635.40846: checking for any_errors_fatal 28023 1726853635.40858: done checking for any_errors_fatal 
28023 1726853635.40859: checking for max_fail_percentage 28023 1726853635.40861: done checking for max_fail_percentage 28023 1726853635.40862: checking to see if all hosts have failed and the running result is not ok 28023 1726853635.40862: done checking to see if all hosts have failed 28023 1726853635.40863: getting the remaining hosts for this loop 28023 1726853635.40865: done getting the remaining hosts for this loop 28023 1726853635.40868: getting the next task for host managed_node3 28023 1726853635.40876: done getting next task for host managed_node3 28023 1726853635.40880: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28023 1726853635.40883: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853635.40901: getting variables 28023 1726853635.40902: in VariableManager get_vars() 28023 1726853635.40936: Calling all_inventory to load vars for managed_node3 28023 1726853635.40939: Calling groups_inventory to load vars for managed_node3 28023 1726853635.40941: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853635.40950: Calling all_plugins_play to load vars for managed_node3 28023 1726853635.40953: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853635.40955: Calling groups_plugins_play to load vars for managed_node3 28023 1726853635.41796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853635.43020: done with get_vars() 28023 1726853635.43046: done getting variables 28023 1726853635.43110: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:33:55 -0400 (0:00:00.038) 0:00:27.515 ****** 28023 1726853635.43149: entering _queue_task() for managed_node3/debug 28023 1726853635.43429: worker is 1 (out of 1 available) 28023 1726853635.43443: exiting _queue_task() for managed_node3/debug 28023 1726853635.43674: done queuing things up, now waiting for results queue to drain 28023 1726853635.43676: waiting for pending results... 
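The task banner above points at `roles/network/tasks/main.yml:177`, and the run that follows resolves `__network_connections_result` and prints its `stderr_lines`. A minimal sketch of what such a task presumably looks like (the real role task may differ; the variable name is taken from the log):

```yaml
# Hedged sketch of the "Show stderr messages" task seen in the banner above.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```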
28023 1726853635.43754: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28023 1726853635.43869: in run() - task 02083763-bbaf-fdb6-dad7-00000000007e 28023 1726853635.43885: variable 'ansible_search_path' from source: unknown 28023 1726853635.43892: variable 'ansible_search_path' from source: unknown 28023 1726853635.43922: calling self._execute() 28023 1726853635.44015: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.44019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853635.44027: variable 'omit' from source: magic vars 28023 1726853635.44303: variable 'ansible_distribution_major_version' from source: facts 28023 1726853635.44314: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853635.44320: variable 'omit' from source: magic vars 28023 1726853635.44366: variable 'omit' from source: magic vars 28023 1726853635.44392: variable 'omit' from source: magic vars 28023 1726853635.44422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853635.44451: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853635.44470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853635.44484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853635.44494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853635.44517: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853635.44521: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.44523: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 28023 1726853635.44593: Set connection var ansible_shell_type to sh 28023 1726853635.44600: Set connection var ansible_shell_executable to /bin/sh 28023 1726853635.44605: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853635.44610: Set connection var ansible_connection to ssh 28023 1726853635.44615: Set connection var ansible_pipelining to False 28023 1726853635.44620: Set connection var ansible_timeout to 10 28023 1726853635.44640: variable 'ansible_shell_executable' from source: unknown 28023 1726853635.44643: variable 'ansible_connection' from source: unknown 28023 1726853635.44646: variable 'ansible_module_compression' from source: unknown 28023 1726853635.44651: variable 'ansible_shell_type' from source: unknown 28023 1726853635.44653: variable 'ansible_shell_executable' from source: unknown 28023 1726853635.44655: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.44657: variable 'ansible_pipelining' from source: unknown 28023 1726853635.44667: variable 'ansible_timeout' from source: unknown 28023 1726853635.44670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853635.44767: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853635.44780: variable 'omit' from source: magic vars 28023 1726853635.44784: starting attempt loop 28023 1726853635.44786: running the handler 28023 1726853635.44877: variable '__network_connections_result' from source: set_fact 28023 1726853635.44923: handler run complete 28023 1726853635.44936: attempt loop complete, returning result 28023 1726853635.44939: _execute() done 28023 1726853635.44942: dumping result to json 28023 1726853635.44944: 
done dumping result, returning 28023 1726853635.44953: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-fdb6-dad7-00000000007e] 28023 1726853635.44957: sending task result for task 02083763-bbaf-fdb6-dad7-00000000007e 28023 1726853635.45039: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000007e 28023 1726853635.45042: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34"
    ]
}
28023 1726853635.45105: no more pending results, returning what we have 28023 1726853635.45109: results queue empty 28023 1726853635.45109: checking for any_errors_fatal 28023 1726853635.45117: done checking for any_errors_fatal 28023 1726853635.45117: checking for max_fail_percentage 28023 1726853635.45119: done checking for max_fail_percentage 28023 1726853635.45120: checking to see if all hosts have failed and the running result is not ok 28023 1726853635.45121: done checking to see if all hosts have failed 28023 1726853635.45122: getting the remaining hosts for this loop 28023 1726853635.45123: done getting the remaining hosts for this loop 28023 1726853635.45126: getting the next task for host managed_node3 28023 1726853635.45131: done getting next task for host managed_node3 28023 1726853635.45135: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28023 1726853635.45137: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853635.45146: getting variables 28023 1726853635.45148: in VariableManager get_vars() 28023 1726853635.45185: Calling all_inventory to load vars for managed_node3 28023 1726853635.45187: Calling groups_inventory to load vars for managed_node3 28023 1726853635.45189: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853635.45198: Calling all_plugins_play to load vars for managed_node3 28023 1726853635.45200: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853635.45203: Calling groups_plugins_play to load vars for managed_node3 28023 1726853635.46298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853635.47801: done with get_vars() 28023 1726853635.47822: done getting variables 28023 1726853635.47884: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:33:55 -0400 (0:00:00.047) 0:00:27.563 ****** 28023 1726853635.47916: entering _queue_task() for managed_node3/debug 28023 1726853635.48243: worker is 1 (out of 1 available) 28023 1726853635.48261: exiting _queue_task() for managed_node3/debug 28023 1726853635.48280: done queuing things up, now waiting for results queue to drain 28023 1726853635.48282: waiting for pending results... 
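Editor's note: the `stderr` / `stderr_lines` pair shown in the "Show stderr messages" task result above follows Python's `str.splitlines()` behavior (one entry per line, trailing newline dropped). A minimal sketch with the values copied verbatim from the log; this is illustrative only, not part of the run:

```python
# The raw stderr string as reported in the task result above (trailing "\n" included).
stderr = ("[002] #0, state:None persistent_state:present, 'ethtest1': "
          "update connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34\n")

# splitlines() yields one entry per line and drops the trailing newline,
# matching the "stderr_lines" list in the result.
stderr_lines = stderr.splitlines()
print(stderr_lines)
```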
28023 1726853635.48534: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28023 1726853635.48637: in run() - task 02083763-bbaf-fdb6-dad7-00000000007f 28023 1726853635.48650: variable 'ansible_search_path' from source: unknown 28023 1726853635.48654: variable 'ansible_search_path' from source: unknown 28023 1726853635.48691: calling self._execute() 28023 1726853635.48768: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.48775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853635.48787: variable 'omit' from source: magic vars 28023 1726853635.49056: variable 'ansible_distribution_major_version' from source: facts 28023 1726853635.49069: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853635.49076: variable 'omit' from source: magic vars 28023 1726853635.49120: variable 'omit' from source: magic vars 28023 1726853635.49144: variable 'omit' from source: magic vars 28023 1726853635.49179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853635.49205: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853635.49225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853635.49239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853635.49248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853635.49276: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853635.49279: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.49282: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 28023 1726853635.49348: Set connection var ansible_shell_type to sh 28023 1726853635.49355: Set connection var ansible_shell_executable to /bin/sh 28023 1726853635.49362: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853635.49367: Set connection var ansible_connection to ssh 28023 1726853635.49376: Set connection var ansible_pipelining to False 28023 1726853635.49379: Set connection var ansible_timeout to 10 28023 1726853635.49398: variable 'ansible_shell_executable' from source: unknown 28023 1726853635.49401: variable 'ansible_connection' from source: unknown 28023 1726853635.49404: variable 'ansible_module_compression' from source: unknown 28023 1726853635.49407: variable 'ansible_shell_type' from source: unknown 28023 1726853635.49409: variable 'ansible_shell_executable' from source: unknown 28023 1726853635.49411: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.49413: variable 'ansible_pipelining' from source: unknown 28023 1726853635.49416: variable 'ansible_timeout' from source: unknown 28023 1726853635.49421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853635.49523: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853635.49531: variable 'omit' from source: magic vars 28023 1726853635.49536: starting attempt loop 28023 1726853635.49539: running the handler 28023 1726853635.49584: variable '__network_connections_result' from source: set_fact 28023 1726853635.49641: variable '__network_connections_result' from source: set_fact 28023 1726853635.49739: handler run complete 28023 1726853635.49760: attempt loop complete, returning result 28023 1726853635.49763: 
_execute() done 28023 1726853635.49775: dumping result to json 28023 1726853635.49778: done dumping result, returning 28023 1726853635.49784: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-fdb6-dad7-00000000007f] 28023 1726853635.49788: sending task result for task 02083763-bbaf-fdb6-dad7-00000000007f 28023 1726853635.49874: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000007f 28023 1726853635.49877: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "autoconnect": false,
                        "ip": {
                            "address": [
                                "198.51.100.4/24",
                                "2001:db8::6/32"
                            ],
                            "route": [
                                {
                                    "gateway": "198.51.100.102",
                                    "metric": 4,
                                    "network": "198.58.10.64",
                                    "prefix": 26
                                }
                            ]
                        },
                        "mac": "32:38:a6:2e:17:d5",
                        "name": "ethtest1",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3c3e532f-c676-4575-81e5-c6f885a09e34"
        ]
    }
}
28023 1726853635.50000: no more pending results, returning what we have 28023 1726853635.50003: results queue empty 28023 1726853635.50004: checking for any_errors_fatal 28023 1726853635.50008: done checking for any_errors_fatal 28023 1726853635.50009: checking for max_fail_percentage 28023 1726853635.50011: done checking for max_fail_percentage 28023 1726853635.50011: checking to see if all hosts have failed and the running result is not ok 28023 1726853635.50012: done checking to see if all hosts have failed 28023 1726853635.50013: getting the remaining hosts for this loop 28023 
1726853635.50014: done getting the remaining hosts for this loop 28023 1726853635.50017: getting the next task for host managed_node3 28023 1726853635.50021: done getting next task for host managed_node3 28023 1726853635.50024: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28023 1726853635.50027: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853635.50035: getting variables 28023 1726853635.50036: in VariableManager get_vars() 28023 1726853635.50076: Calling all_inventory to load vars for managed_node3 28023 1726853635.50084: Calling groups_inventory to load vars for managed_node3 28023 1726853635.50089: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853635.50097: Calling all_plugins_play to load vars for managed_node3 28023 1726853635.50101: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853635.50103: Calling groups_plugins_play to load vars for managed_node3 28023 1726853635.51316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853635.52320: done with get_vars() 28023 1726853635.52342: done getting variables 28023 1726853635.52406: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:33:55 -0400 (0:00:00.045) 0:00:27.608 ****** 28023 1726853635.52440: entering _queue_task() for managed_node3/debug 28023 1726853635.52742: worker is 1 (out of 1 available) 28023 1726853635.52754: exiting _queue_task() for managed_node3/debug 28023 1726853635.52773: done queuing things up, now waiting for results queue to drain 28023 1726853635.52775: waiting for pending results... 28023 1726853635.53132: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28023 1726853635.53220: in run() - task 02083763-bbaf-fdb6-dad7-000000000080 28023 1726853635.53249: variable 'ansible_search_path' from source: unknown 28023 1726853635.53253: variable 'ansible_search_path' from source: unknown 28023 1726853635.53342: calling self._execute() 28023 1726853635.53551: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.53555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853635.53562: variable 'omit' from source: magic vars 28023 1726853635.53803: variable 'ansible_distribution_major_version' from source: facts 28023 1726853635.53806: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853635.53945: variable 'network_state' from source: role '' defaults 28023 1726853635.54020: Evaluated conditional (network_state != {}): False 28023 1726853635.54024: when evaluation is False, skipping this task 28023 1726853635.54026: _execute() done 28023 1726853635.54028: dumping result to json 28023 1726853635.54031: done 
dumping result, returning 28023 1726853635.54033: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-fdb6-dad7-000000000080] 28023 1726853635.54035: sending task result for task 02083763-bbaf-fdb6-dad7-000000000080
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
28023 1726853635.54281: no more pending results, returning what we have 28023 1726853635.54285: results queue empty 28023 1726853635.54286: checking for any_errors_fatal 28023 1726853635.54301: done checking for any_errors_fatal 28023 1726853635.54302: checking for max_fail_percentage 28023 1726853635.54305: done checking for max_fail_percentage 28023 1726853635.54306: checking to see if all hosts have failed and the running result is not ok 28023 1726853635.54307: done checking to see if all hosts have failed 28023 1726853635.54308: getting the remaining hosts for this loop 28023 1726853635.54309: done getting the remaining hosts for this loop 28023 1726853635.54313: getting the next task for host managed_node3 28023 1726853635.54320: done getting next task for host managed_node3 28023 1726853635.54324: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28023 1726853635.54329: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853635.54348: getting variables 28023 1726853635.54349: in VariableManager get_vars() 28023 1726853635.54393: Calling all_inventory to load vars for managed_node3 28023 1726853635.54397: Calling groups_inventory to load vars for managed_node3 28023 1726853635.54400: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853635.54414: Calling all_plugins_play to load vars for managed_node3 28023 1726853635.54418: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853635.54421: Calling groups_plugins_play to load vars for managed_node3 28023 1726853635.55136: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000080 28023 1726853635.55139: WORKER PROCESS EXITING 28023 1726853635.57822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853635.59426: done with get_vars() 28023 1726853635.59449: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:33:55 -0400 (0:00:00.070) 0:00:27.679 ****** 28023 1726853635.59538: entering _queue_task() for managed_node3/ping 28023 1726853635.59884: worker is 1 (out of 1 available) 28023 1726853635.60013: exiting _queue_task() for managed_node3/ping 28023 1726853635.60025: done queuing things up, now waiting for results queue to drain 28023 1726853635.60027: waiting for pending results... 
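Editor's note: the `__network_connections_result` dump earlier in the log shows the `connections` payload the network role handed to the `nm` provider. A sketch reconstructing that one connection dict in Python for inspection outside Ansible; field names and values are copied from the log, and this is illustrative only, not the role's internal data model:

```python
# The single "ethtest1" connection from the module_args shown in the log above.
connection = {
    "autoconnect": False,
    "ip": {
        "address": ["198.51.100.4/24", "2001:db8::6/32"],  # mixed IPv4/IPv6 static addresses
        "route": [
            {
                "gateway": "198.51.100.102",
                "metric": 4,
                "network": "198.58.10.64",
                "prefix": 26,
            }
        ],
    },
    "mac": "32:38:a6:2e:17:d5",
    "name": "ethtest1",
    "type": "ethernet",
}

# Example inspection: separate the IPv4 addresses (no ":") from IPv6 ones.
ipv4 = [a for a in connection["ip"]["address"] if ":" not in a]
print(ipv4)
print(connection["ip"]["route"][0]["network"])
```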
28023 1726853635.60240: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 28023 1726853635.60413: in run() - task 02083763-bbaf-fdb6-dad7-000000000081 28023 1726853635.60472: variable 'ansible_search_path' from source: unknown 28023 1726853635.60477: variable 'ansible_search_path' from source: unknown 28023 1726853635.60496: calling self._execute() 28023 1726853635.60613: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.60625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853635.60688: variable 'omit' from source: magic vars 28023 1726853635.61037: variable 'ansible_distribution_major_version' from source: facts 28023 1726853635.61056: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853635.61067: variable 'omit' from source: magic vars 28023 1726853635.61130: variable 'omit' from source: magic vars 28023 1726853635.61169: variable 'omit' from source: magic vars 28023 1726853635.61214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853635.61341: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853635.61345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853635.61348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853635.61350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853635.61360: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853635.61368: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.61379: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 28023 1726853635.61487: Set connection var ansible_shell_type to sh 28023 1726853635.61500: Set connection var ansible_shell_executable to /bin/sh 28023 1726853635.61511: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853635.61522: Set connection var ansible_connection to ssh 28023 1726853635.61532: Set connection var ansible_pipelining to False 28023 1726853635.61543: Set connection var ansible_timeout to 10 28023 1726853635.61581: variable 'ansible_shell_executable' from source: unknown 28023 1726853635.61590: variable 'ansible_connection' from source: unknown 28023 1726853635.61598: variable 'ansible_module_compression' from source: unknown 28023 1726853635.61606: variable 'ansible_shell_type' from source: unknown 28023 1726853635.61613: variable 'ansible_shell_executable' from source: unknown 28023 1726853635.61621: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853635.61666: variable 'ansible_pipelining' from source: unknown 28023 1726853635.61669: variable 'ansible_timeout' from source: unknown 28023 1726853635.61674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853635.61857: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853635.61883: variable 'omit' from source: magic vars 28023 1726853635.61895: starting attempt loop 28023 1726853635.61903: running the handler 28023 1726853635.61976: _low_level_execute_command(): starting 28023 1726853635.61981: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853635.62768: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853635.62782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853635.62956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853635.64763: stdout chunk (state=3): >>>/root <<< 28023 1726853635.64826: stdout chunk (state=3): >>><<< 28023 1726853635.64835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853635.64846: stderr chunk (state=3): >>><<< 28023 1726853635.64881: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853635.64902: _low_level_execute_command(): starting 28023 1726853635.64989: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113 `" && echo ansible-tmp-1726853635.6488912-29318-102278621635113="` echo /root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113 `" ) && sleep 0' 28023 1726853635.65565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853635.65585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853635.65693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853635.65717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853635.65731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853635.65829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853635.67792: stdout chunk (state=3): >>>ansible-tmp-1726853635.6488912-29318-102278621635113=/root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113 <<< 28023 1726853635.68177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853635.68180: stdout chunk (state=3): >>><<< 28023 1726853635.68183: stderr chunk (state=3): >>><<< 28023 1726853635.68186: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853635.6488912-29318-102278621635113=/root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853635.68188: variable 'ansible_module_compression' from source: unknown 28023 1726853635.68191: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28023 1726853635.68193: variable 'ansible_facts' from source: unknown 28023 1726853635.68199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113/AnsiballZ_ping.py 28023 1726853635.68388: Sending initial data 28023 1726853635.68398: Sent initial data (153 bytes) 28023 1726853635.68968: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853635.68985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853635.68999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853635.69017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853635.69037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853635.69079: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 
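Editor's note: the remote command logged above (`umask 77 && mkdir -p ... && mkdir ... && echo ansible-tmp-...=...`) is Ansible's idiom for creating an owner-only temp directory and echoing its resolved path back to the controller. A rough Python equivalent of that idiom, under the assumption that the directory name format here is just a hypothetical stand-in:

```python
import os
import tempfile
import time

# Create the tmpdir under the system temp root; real Ansible uses
# ~/.ansible/tmp with a timestamp-plus-random suffix.
base = tempfile.gettempdir()
tmpdir = os.path.join(base, f"ansible-tmp-{time.time()}-{os.getpid()}")

old_umask = os.umask(0o77)  # equivalent of `umask 77`: strip group/other bits
try:
    os.makedirs(tmpdir)     # errors if it already exists, like plain `mkdir`
finally:
    os.umask(old_umask)     # restore the previous umask

# Echo the path back, as the logged command does with `echo ansible-tmp-...=...`
print(f"remote_tmp={tmpdir}")
```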
28023 1726853635.69093: stderr chunk (state=3): >>>debug2: match found <<< 28023 1726853635.69178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853635.69201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853635.69245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853635.69313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853635.70954: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853635.71035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853635.71128: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpgqtud043 /root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113/AnsiballZ_ping.py <<< 28023 1726853635.71140: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113/AnsiballZ_ping.py" <<< 28023 1726853635.71187: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpgqtud043" to remote "/root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113/AnsiballZ_ping.py" <<< 28023 1726853635.71976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853635.71990: stdout chunk (state=3): >>><<< 28023 1726853635.72001: stderr chunk (state=3): >>><<< 28023 1726853635.72060: done transferring module to remote 28023 1726853635.72077: _low_level_execute_command(): starting 28023 1726853635.72091: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113/ /root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113/AnsiballZ_ping.py && sleep 0' 28023 1726853635.72737: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853635.72751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853635.72769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853635.72788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853635.72832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853635.72846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853635.72938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853635.72983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853635.73059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853635.74987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853635.74991: stdout chunk (state=3): >>><<< 28023 1726853635.74994: stderr chunk (state=3): >>><<< 28023 1726853635.75094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853635.75097: _low_level_execute_command(): starting 28023 1726853635.75100: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113/AnsiballZ_ping.py && sleep 0' 28023 1726853635.75664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853635.75683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853635.75738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853635.75805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853635.75822: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853635.75856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853635.75954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853635.91310: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28023 1726853635.92748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853635.92752: stdout chunk (state=3): >>><<< 28023 1726853635.92754: stderr chunk (state=3): >>><<< 28023 1726853635.92766: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 
closed. 28023 1726853635.92796: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853635.92803: _low_level_execute_command(): starting 28023 1726853635.92977: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853635.6488912-29318-102278621635113/ > /dev/null 2>&1 && sleep 0' 28023 1726853635.93490: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853635.93500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853635.93511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853635.93534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853635.93547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853635.93555: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853635.93569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853635.93652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853635.93675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853635.93695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853635.93707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853635.93798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853635.95661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853635.95689: stderr chunk (state=3): >>><<< 28023 1726853635.95697: stdout chunk (state=3): >>><<< 28023 1726853635.95708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853635.95713: handler run complete 28023 1726853635.95726: attempt loop complete, returning result 28023 1726853635.95729: _execute() done 28023 1726853635.95731: dumping result to json 28023 1726853635.95734: done dumping result, returning 28023 1726853635.95742: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-fdb6-dad7-000000000081] 28023 1726853635.95747: sending task result for task 02083763-bbaf-fdb6-dad7-000000000081 28023 1726853635.95835: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000081 28023 1726853635.95838: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 28023 1726853635.95922: no more pending results, returning what we have 28023 1726853635.95926: results queue empty 28023 1726853635.95927: checking for any_errors_fatal 28023 1726853635.95935: done checking for any_errors_fatal 28023 1726853635.95936: checking for max_fail_percentage 28023 1726853635.95937: done checking for max_fail_percentage 28023 1726853635.95938: checking to see if all hosts have failed and the running result is not ok 28023 1726853635.95939: done checking to see if all hosts have failed 28023 1726853635.95940: getting the remaining hosts for this loop 28023 1726853635.95941: done getting the remaining hosts for this loop 28023 1726853635.95945: getting the next task for host managed_node3 28023 1726853635.95954: done getting next task for host managed_node3 28023 1726853635.95956: ^ task is: TASK: meta (role_complete) 28023 1726853635.95959: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853635.95969: getting variables 28023 1726853635.95973: in VariableManager get_vars() 28023 1726853635.96012: Calling all_inventory to load vars for managed_node3 28023 1726853635.96014: Calling groups_inventory to load vars for managed_node3 28023 1726853635.96016: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853635.96026: Calling all_plugins_play to load vars for managed_node3 28023 1726853635.96028: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853635.96030: Calling groups_plugins_play to load vars for managed_node3 28023 1726853635.96915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853635.98167: done with get_vars() 28023 1726853635.98186: done getting variables 28023 1726853635.98243: done queuing things up, now waiting for results queue to drain 28023 1726853635.98244: results queue empty 28023 1726853635.98245: checking for any_errors_fatal 28023 1726853635.98247: done checking for any_errors_fatal 28023 1726853635.98247: checking for max_fail_percentage 28023 1726853635.98248: done checking for max_fail_percentage 28023 1726853635.98248: checking to see if all hosts have failed and the running result is not ok 28023 1726853635.98249: done checking to see if all hosts have failed 28023 1726853635.98249: getting the remaining hosts for this loop 28023 1726853635.98250: done getting the remaining hosts for this loop 28023 1726853635.98251: getting the next task for host managed_node3 28023 
1726853635.98254: done getting next task for host managed_node3 28023 1726853635.98256: ^ task is: TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 28023 1726853635.98257: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853635.98259: getting variables 28023 1726853635.98259: in VariableManager get_vars() 28023 1726853635.98269: Calling all_inventory to load vars for managed_node3 28023 1726853635.98273: Calling groups_inventory to load vars for managed_node3 28023 1726853635.98275: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853635.98279: Calling all_plugins_play to load vars for managed_node3 28023 1726853635.98280: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853635.98282: Calling groups_plugins_play to load vars for managed_node3 28023 1726853635.98976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853635.99915: done with get_vars() 28023 1726853635.99936: done getting variables 28023 1726853635.99981: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the warning about specifying the route without the output device is logged for initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:122 Friday 20 September 2024 13:33:55 -0400 (0:00:00.404) 0:00:28.083 
****** 28023 1726853636.00011: entering _queue_task() for managed_node3/assert 28023 1726853636.00392: worker is 1 (out of 1 available) 28023 1726853636.00404: exiting _queue_task() for managed_node3/assert 28023 1726853636.00418: done queuing things up, now waiting for results queue to drain 28023 1726853636.00419: waiting for pending results... 28023 1726853636.00890: running TaskExecutor() for managed_node3/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 28023 1726853636.00895: in run() - task 02083763-bbaf-fdb6-dad7-0000000000b1 28023 1726853636.00898: variable 'ansible_search_path' from source: unknown 28023 1726853636.00902: calling self._execute() 28023 1726853636.00984: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853636.00994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853636.01008: variable 'omit' from source: magic vars 28023 1726853636.01361: variable 'ansible_distribution_major_version' from source: facts 28023 1726853636.01370: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853636.01448: variable 'network_provider' from source: set_fact 28023 1726853636.01452: Evaluated conditional (network_provider == "initscripts"): False 28023 1726853636.01454: when evaluation is False, skipping this task 28023 1726853636.01460: _execute() done 28023 1726853636.01463: dumping result to json 28023 1726853636.01465: done dumping result, returning 28023 1726853636.01473: done running TaskExecutor() for managed_node3/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider [02083763-bbaf-fdb6-dad7-0000000000b1] 28023 1726853636.01476: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b1 28023 1726853636.01565: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b1 28023 1726853636.01567: WORKER PROCESS 
EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28023 1726853636.01617: no more pending results, returning what we have 28023 1726853636.01620: results queue empty 28023 1726853636.01621: checking for any_errors_fatal 28023 1726853636.01622: done checking for any_errors_fatal 28023 1726853636.01623: checking for max_fail_percentage 28023 1726853636.01625: done checking for max_fail_percentage 28023 1726853636.01626: checking to see if all hosts have failed and the running result is not ok 28023 1726853636.01627: done checking to see if all hosts have failed 28023 1726853636.01627: getting the remaining hosts for this loop 28023 1726853636.01629: done getting the remaining hosts for this loop 28023 1726853636.01632: getting the next task for host managed_node3 28023 1726853636.01638: done getting next task for host managed_node3 28023 1726853636.01640: ^ task is: TASK: Assert that no warning is logged for nm provider 28023 1726853636.01643: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853636.01646: getting variables 28023 1726853636.01647: in VariableManager get_vars() 28023 1726853636.01695: Calling all_inventory to load vars for managed_node3 28023 1726853636.01697: Calling groups_inventory to load vars for managed_node3 28023 1726853636.01699: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853636.01710: Calling all_plugins_play to load vars for managed_node3 28023 1726853636.01712: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853636.01715: Calling groups_plugins_play to load vars for managed_node3 28023 1726853636.02490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853636.03950: done with get_vars() 28023 1726853636.03975: done getting variables 28023 1726853636.04032: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that no warning is logged for nm provider] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:129 Friday 20 September 2024 13:33:56 -0400 (0:00:00.040) 0:00:28.124 ****** 28023 1726853636.04061: entering _queue_task() for managed_node3/assert 28023 1726853636.04380: worker is 1 (out of 1 available) 28023 1726853636.04392: exiting _queue_task() for managed_node3/assert 28023 1726853636.04405: done queuing things up, now waiting for results queue to drain 28023 1726853636.04406: waiting for pending results... 
28023 1726853636.04665: running TaskExecutor() for managed_node3/TASK: Assert that no warning is logged for nm provider 28023 1726853636.04728: in run() - task 02083763-bbaf-fdb6-dad7-0000000000b2 28023 1726853636.04742: variable 'ansible_search_path' from source: unknown 28023 1726853636.04775: calling self._execute() 28023 1726853636.04864: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853636.04870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853636.04881: variable 'omit' from source: magic vars 28023 1726853636.05166: variable 'ansible_distribution_major_version' from source: facts 28023 1726853636.05178: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853636.05255: variable 'network_provider' from source: set_fact 28023 1726853636.05258: Evaluated conditional (network_provider == "nm"): True 28023 1726853636.05267: variable 'omit' from source: magic vars 28023 1726853636.05289: variable 'omit' from source: magic vars 28023 1726853636.05316: variable 'omit' from source: magic vars 28023 1726853636.05348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853636.05379: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853636.05395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853636.05409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853636.05419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853636.05444: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853636.05446: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 
1726853636.05449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853636.05521: Set connection var ansible_shell_type to sh 28023 1726853636.05528: Set connection var ansible_shell_executable to /bin/sh 28023 1726853636.05533: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853636.05539: Set connection var ansible_connection to ssh 28023 1726853636.05543: Set connection var ansible_pipelining to False 28023 1726853636.05548: Set connection var ansible_timeout to 10 28023 1726853636.05572: variable 'ansible_shell_executable' from source: unknown 28023 1726853636.05576: variable 'ansible_connection' from source: unknown 28023 1726853636.05578: variable 'ansible_module_compression' from source: unknown 28023 1726853636.05580: variable 'ansible_shell_type' from source: unknown 28023 1726853636.05583: variable 'ansible_shell_executable' from source: unknown 28023 1726853636.05585: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853636.05589: variable 'ansible_pipelining' from source: unknown 28023 1726853636.05591: variable 'ansible_timeout' from source: unknown 28023 1726853636.05599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853636.05699: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853636.05710: variable 'omit' from source: magic vars 28023 1726853636.05713: starting attempt loop 28023 1726853636.05716: running the handler 28023 1726853636.05825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853636.05996: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
28023 1726853636.06026: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853636.10358: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853636.10392: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853636.10454: variable '__network_connections_result' from source: set_fact 28023 1726853636.10475: Evaluated conditional (__network_connections_result.stderr is not search("")): True 28023 1726853636.10481: handler run complete 28023 1726853636.10496: attempt loop complete, returning result 28023 1726853636.10499: _execute() done 28023 1726853636.10501: dumping result to json 28023 1726853636.10504: done dumping result, returning 28023 1726853636.10506: done running TaskExecutor() for managed_node3/TASK: Assert that no warning is logged for nm provider [02083763-bbaf-fdb6-dad7-0000000000b2] 28023 1726853636.10508: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b2 28023 1726853636.10590: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b2 28023 1726853636.10593: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 28023 1726853636.10640: no more pending results, returning what we have 28023 1726853636.10643: results queue empty 28023 1726853636.10644: checking for any_errors_fatal 28023 1726853636.10651: done checking for any_errors_fatal 28023 1726853636.10652: checking for max_fail_percentage 28023 1726853636.10653: done checking for max_fail_percentage 28023 1726853636.10654: checking to see if all hosts have failed and the running result is not ok 28023 1726853636.10655: done checking to see if all hosts have failed 28023 1726853636.10656: getting the remaining hosts for this loop 28023 1726853636.10657: done getting the remaining hosts for this loop 28023 1726853636.10660: getting the next task for 
host managed_node3 28023 1726853636.10669: done getting next task for host managed_node3 28023 1726853636.10674: ^ task is: TASK: Bring down test devices and profiles 28023 1726853636.10677: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853636.10681: getting variables 28023 1726853636.10682: in VariableManager get_vars() 28023 1726853636.10722: Calling all_inventory to load vars for managed_node3 28023 1726853636.10724: Calling groups_inventory to load vars for managed_node3 28023 1726853636.10726: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853636.10736: Calling all_plugins_play to load vars for managed_node3 28023 1726853636.10738: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853636.10741: Calling groups_plugins_play to load vars for managed_node3 28023 1726853636.14742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853636.15584: done with get_vars() 28023 1726853636.15598: done getting variables TASK [Bring down test devices and profiles] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:140 Friday 20 September 2024 13:33:56 -0400 (0:00:00.115) 0:00:28.240 ****** 28023 1726853636.15656: entering _queue_task() for managed_node3/include_role 28023 1726853636.15658: Creating lock for include_role 
28023 1726853636.15920: worker is 1 (out of 1 available) 28023 1726853636.15933: exiting _queue_task() for managed_node3/include_role 28023 1726853636.15946: done queuing things up, now waiting for results queue to drain 28023 1726853636.15948: waiting for pending results... 28023 1726853636.16140: running TaskExecutor() for managed_node3/TASK: Bring down test devices and profiles 28023 1726853636.16223: in run() - task 02083763-bbaf-fdb6-dad7-0000000000b4 28023 1726853636.16236: variable 'ansible_search_path' from source: unknown 28023 1726853636.16268: calling self._execute() 28023 1726853636.16347: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853636.16350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853636.16361: variable 'omit' from source: magic vars 28023 1726853636.16650: variable 'ansible_distribution_major_version' from source: facts 28023 1726853636.16662: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853636.16668: _execute() done 28023 1726853636.16673: dumping result to json 28023 1726853636.16676: done dumping result, returning 28023 1726853636.16682: done running TaskExecutor() for managed_node3/TASK: Bring down test devices and profiles [02083763-bbaf-fdb6-dad7-0000000000b4] 28023 1726853636.16688: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b4 28023 1726853636.16793: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b4 28023 1726853636.16796: WORKER PROCESS EXITING 28023 1726853636.16839: no more pending results, returning what we have 28023 1726853636.16843: in VariableManager get_vars() 28023 1726853636.16887: Calling all_inventory to load vars for managed_node3 28023 1726853636.16891: Calling groups_inventory to load vars for managed_node3 28023 1726853636.16893: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853636.16908: Calling all_plugins_play to load vars for managed_node3 
28023 1726853636.16910: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853636.16913: Calling groups_plugins_play to load vars for managed_node3 28023 1726853636.17660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853636.18679: done with get_vars() 28023 1726853636.18692: variable 'ansible_search_path' from source: unknown 28023 1726853636.18841: variable 'omit' from source: magic vars 28023 1726853636.18864: variable 'omit' from source: magic vars 28023 1726853636.18875: variable 'omit' from source: magic vars 28023 1726853636.18878: we have included files to process 28023 1726853636.18878: generating all_blocks data 28023 1726853636.18880: done generating all_blocks data 28023 1726853636.18884: processing included file: fedora.linux_system_roles.network 28023 1726853636.18897: in VariableManager get_vars() 28023 1726853636.18909: done with get_vars() 28023 1726853636.18927: in VariableManager get_vars() 28023 1726853636.18941: done with get_vars() 28023 1726853636.18974: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28023 1726853636.19041: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28023 1726853636.19093: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28023 1726853636.19353: in VariableManager get_vars() 28023 1726853636.19372: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28023 1726853636.20580: iterating over new_blocks loaded from include file 28023 1726853636.20581: in VariableManager get_vars() 28023 1726853636.20595: done with get_vars() 28023 1726853636.20596: filtering new block on tags 28023 1726853636.20836: done filtering new block on tags 28023 1726853636.20840: in VariableManager get_vars() 28023 
1726853636.20859: done with get_vars() 28023 1726853636.20861: filtering new block on tags 28023 1726853636.20879: done filtering new block on tags 28023 1726853636.20881: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 28023 1726853636.20886: extending task lists for all hosts with included blocks 28023 1726853636.21104: done extending task lists 28023 1726853636.21106: done processing included files 28023 1726853636.21107: results queue empty 28023 1726853636.21107: checking for any_errors_fatal 28023 1726853636.21111: done checking for any_errors_fatal 28023 1726853636.21112: checking for max_fail_percentage 28023 1726853636.21113: done checking for max_fail_percentage 28023 1726853636.21114: checking to see if all hosts have failed and the running result is not ok 28023 1726853636.21115: done checking to see if all hosts have failed 28023 1726853636.21115: getting the remaining hosts for this loop 28023 1726853636.21117: done getting the remaining hosts for this loop 28023 1726853636.21119: getting the next task for host managed_node3 28023 1726853636.21125: done getting next task for host managed_node3 28023 1726853636.21128: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28023 1726853636.21131: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853636.21141: getting variables 28023 1726853636.21142: in VariableManager get_vars() 28023 1726853636.21159: Calling all_inventory to load vars for managed_node3 28023 1726853636.21161: Calling groups_inventory to load vars for managed_node3 28023 1726853636.21163: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853636.21169: Calling all_plugins_play to load vars for managed_node3 28023 1726853636.21173: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853636.21176: Calling groups_plugins_play to load vars for managed_node3 28023 1726853636.22348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853636.23297: done with get_vars() 28023 1726853636.23311: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:33:56 -0400 (0:00:00.077) 0:00:28.317 ****** 28023 1726853636.23366: entering _queue_task() for managed_node3/include_tasks 28023 1726853636.23630: worker is 1 (out of 1 available) 28023 1726853636.23644: exiting _queue_task() for managed_node3/include_tasks 28023 1726853636.23659: done queuing things up, now waiting for results queue to drain 28023 1726853636.23661: waiting for pending results... 
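The task header above ("Ensure ansible_facts used by role", `tasks/main.yml:4`) is dispatched as an `include_tasks`, and the log below shows it loading `set_facts.yml` from the role. A minimal sketch of such a task, assuming the file reference is relative to the role's tasks directory:

```yaml
# roles/network/tasks/main.yml, line 4 — sketch reconstructed from the log.
# The task name, handler type (include_tasks), and included file name come
# from the log; the exact YAML layout in the role is an assumption.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml
```

Unlike `import_tasks`, `include_tasks` is processed at runtime, which is why the log shows a full queue/execute/result cycle for the include itself before the included file's blocks are generated and filtered on tags.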
28023 1726853636.23840: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28023 1726853636.23931: in run() - task 02083763-bbaf-fdb6-dad7-000000000641 28023 1726853636.23944: variable 'ansible_search_path' from source: unknown 28023 1726853636.23947: variable 'ansible_search_path' from source: unknown 28023 1726853636.23978: calling self._execute() 28023 1726853636.24058: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853636.24063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853636.24070: variable 'omit' from source: magic vars 28023 1726853636.24576: variable 'ansible_distribution_major_version' from source: facts 28023 1726853636.24579: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853636.24582: _execute() done 28023 1726853636.24584: dumping result to json 28023 1726853636.24587: done dumping result, returning 28023 1726853636.24589: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-fdb6-dad7-000000000641] 28023 1726853636.24591: sending task result for task 02083763-bbaf-fdb6-dad7-000000000641 28023 1726853636.24663: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000641 28023 1726853636.24667: WORKER PROCESS EXITING 28023 1726853636.24711: no more pending results, returning what we have 28023 1726853636.24716: in VariableManager get_vars() 28023 1726853636.24763: Calling all_inventory to load vars for managed_node3 28023 1726853636.24765: Calling groups_inventory to load vars for managed_node3 28023 1726853636.24767: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853636.24779: Calling all_plugins_play to load vars for managed_node3 28023 1726853636.24781: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853636.24784: Calling 
groups_plugins_play to load vars for managed_node3 28023 1726853636.26112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853636.27736: done with get_vars() 28023 1726853636.27755: variable 'ansible_search_path' from source: unknown 28023 1726853636.27759: variable 'ansible_search_path' from source: unknown 28023 1726853636.27805: we have included files to process 28023 1726853636.27806: generating all_blocks data 28023 1726853636.27808: done generating all_blocks data 28023 1726853636.27811: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28023 1726853636.27812: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28023 1726853636.27814: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28023 1726853636.28404: done processing included file 28023 1726853636.28407: iterating over new_blocks loaded from include file 28023 1726853636.28408: in VariableManager get_vars() 28023 1726853636.28434: done with get_vars() 28023 1726853636.28435: filtering new block on tags 28023 1726853636.28469: done filtering new block on tags 28023 1726853636.28474: in VariableManager get_vars() 28023 1726853636.28498: done with get_vars() 28023 1726853636.28500: filtering new block on tags 28023 1726853636.28538: done filtering new block on tags 28023 1726853636.28540: in VariableManager get_vars() 28023 1726853636.28570: done with get_vars() 28023 1726853636.28573: filtering new block on tags 28023 1726853636.28609: done filtering new block on tags 28023 1726853636.28611: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 28023 1726853636.28616: extending task lists for 
all hosts with included blocks 28023 1726853636.29687: done extending task lists 28023 1726853636.29688: done processing included files 28023 1726853636.29689: results queue empty 28023 1726853636.29690: checking for any_errors_fatal 28023 1726853636.29693: done checking for any_errors_fatal 28023 1726853636.29693: checking for max_fail_percentage 28023 1726853636.29694: done checking for max_fail_percentage 28023 1726853636.29695: checking to see if all hosts have failed and the running result is not ok 28023 1726853636.29696: done checking to see if all hosts have failed 28023 1726853636.29697: getting the remaining hosts for this loop 28023 1726853636.29698: done getting the remaining hosts for this loop 28023 1726853636.29701: getting the next task for host managed_node3 28023 1726853636.29705: done getting next task for host managed_node3 28023 1726853636.29707: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28023 1726853636.29711: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853636.29720: getting variables 28023 1726853636.29721: in VariableManager get_vars() 28023 1726853636.29736: Calling all_inventory to load vars for managed_node3 28023 1726853636.29743: Calling groups_inventory to load vars for managed_node3 28023 1726853636.29746: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853636.29751: Calling all_plugins_play to load vars for managed_node3 28023 1726853636.29753: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853636.29756: Calling groups_plugins_play to load vars for managed_node3 28023 1726853636.30998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853636.32653: done with get_vars() 28023 1726853636.32682: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:33:56 -0400 (0:00:00.093) 0:00:28.411 ****** 28023 1726853636.32754: entering _queue_task() for managed_node3/setup 28023 1726853636.33122: worker is 1 (out of 1 available) 28023 1726853636.33135: exiting _queue_task() for managed_node3/setup 28023 1726853636.33147: done queuing things up, now waiting for results queue to drain 28023 1726853636.33148: waiting for pending results... 
28023 1726853636.33464: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28023 1726853636.33611: in run() - task 02083763-bbaf-fdb6-dad7-0000000006a7 28023 1726853636.33625: variable 'ansible_search_path' from source: unknown 28023 1726853636.33629: variable 'ansible_search_path' from source: unknown 28023 1726853636.33676: calling self._execute() 28023 1726853636.33780: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853636.33786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853636.33794: variable 'omit' from source: magic vars 28023 1726853636.34185: variable 'ansible_distribution_major_version' from source: facts 28023 1726853636.34376: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853636.34432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853636.36625: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853636.36701: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853636.36738: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853636.36777: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853636.36811: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853636.36893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853636.36930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853636.36956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853636.36997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853636.37018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853636.37075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853636.37276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853636.37280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853636.37282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853636.37285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853636.37475: variable '__network_required_facts' from source: role 
'' defaults 28023 1726853636.37479: variable 'ansible_facts' from source: unknown 28023 1726853636.38127: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28023 1726853636.38130: when evaluation is False, skipping this task 28023 1726853636.38133: _execute() done 28023 1726853636.38136: dumping result to json 28023 1726853636.38138: done dumping result, returning 28023 1726853636.38147: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-fdb6-dad7-0000000006a7] 28023 1726853636.38150: sending task result for task 02083763-bbaf-fdb6-dad7-0000000006a7 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853636.38288: no more pending results, returning what we have 28023 1726853636.38293: results queue empty 28023 1726853636.38294: checking for any_errors_fatal 28023 1726853636.38296: done checking for any_errors_fatal 28023 1726853636.38296: checking for max_fail_percentage 28023 1726853636.38298: done checking for max_fail_percentage 28023 1726853636.38299: checking to see if all hosts have failed and the running result is not ok 28023 1726853636.38300: done checking to see if all hosts have failed 28023 1726853636.38301: getting the remaining hosts for this loop 28023 1726853636.38303: done getting the remaining hosts for this loop 28023 1726853636.38306: getting the next task for host managed_node3 28023 1726853636.38319: done getting next task for host managed_node3 28023 1726853636.38323: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28023 1726853636.38329: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853636.38344: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000006a7 28023 1726853636.38347: WORKER PROCESS EXITING 28023 1726853636.38468: getting variables 28023 1726853636.38470: in VariableManager get_vars() 28023 1726853636.38516: Calling all_inventory to load vars for managed_node3 28023 1726853636.38520: Calling groups_inventory to load vars for managed_node3 28023 1726853636.38522: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853636.38535: Calling all_plugins_play to load vars for managed_node3 28023 1726853636.38539: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853636.38542: Calling groups_plugins_play to load vars for managed_node3 28023 1726853636.40102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853636.41762: done with get_vars() 28023 1726853636.41791: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:33:56 -0400 (0:00:00.091) 0:00:28.502 ****** 28023 1726853636.41894: entering _queue_task() for managed_node3/stat 28023 1726853636.42199: worker is 1 (out of 1 available) 28023 1726853636.42211: exiting _queue_task() for managed_node3/stat 28023 1726853636.42336: done queuing things up, now waiting for results queue to drain 28023 1726853636.42338: waiting for pending results... 28023 1726853636.42526: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 28023 1726853636.42686: in run() - task 02083763-bbaf-fdb6-dad7-0000000006a9 28023 1726853636.42700: variable 'ansible_search_path' from source: unknown 28023 1726853636.42704: variable 'ansible_search_path' from source: unknown 28023 1726853636.42736: calling self._execute() 28023 1726853636.42840: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853636.42845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853636.42855: variable 'omit' from source: magic vars 28023 1726853636.43251: variable 'ansible_distribution_major_version' from source: facts 28023 1726853636.43377: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853636.43438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853636.43712: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853636.43759: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853636.43795: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853636.43826: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
28023 1726853636.43916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853636.43938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853636.43973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853636.43998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853636.44089: variable '__network_is_ostree' from source: set_fact 28023 1726853636.44096: Evaluated conditional (not __network_is_ostree is defined): False 28023 1726853636.44099: when evaluation is False, skipping this task 28023 1726853636.44101: _execute() done 28023 1726853636.44105: dumping result to json 28023 1726853636.44107: done dumping result, returning 28023 1726853636.44115: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-fdb6-dad7-0000000006a9] 28023 1726853636.44120: sending task result for task 02083763-bbaf-fdb6-dad7-0000000006a9 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28023 1726853636.44251: no more pending results, returning what we have 28023 1726853636.44254: results queue empty 28023 1726853636.44255: checking for any_errors_fatal 28023 1726853636.44266: done checking for any_errors_fatal 28023 1726853636.44266: checking for max_fail_percentage 28023 1726853636.44268: done checking for 
max_fail_percentage 28023 1726853636.44269: checking to see if all hosts have failed and the running result is not ok 28023 1726853636.44270: done checking to see if all hosts have failed 28023 1726853636.44374: getting the remaining hosts for this loop 28023 1726853636.44377: done getting the remaining hosts for this loop 28023 1726853636.44381: getting the next task for host managed_node3 28023 1726853636.44388: done getting next task for host managed_node3 28023 1726853636.44391: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28023 1726853636.44398: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853636.44413: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000006a9 28023 1726853636.44418: WORKER PROCESS EXITING 28023 1726853636.44430: getting variables 28023 1726853636.44431: in VariableManager get_vars() 28023 1726853636.44477: Calling all_inventory to load vars for managed_node3 28023 1726853636.44480: Calling groups_inventory to load vars for managed_node3 28023 1726853636.44483: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853636.44494: Calling all_plugins_play to load vars for managed_node3 28023 1726853636.44497: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853636.44499: Calling groups_plugins_play to load vars for managed_node3 28023 1726853636.46150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853636.47766: done with get_vars() 28023 1726853636.47790: done getting variables 28023 1726853636.47844: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:33:56 -0400 (0:00:00.059) 0:00:28.562 ****** 28023 1726853636.47887: entering _queue_task() for managed_node3/set_fact 28023 1726853636.48406: worker is 1 (out of 1 available) 28023 1726853636.48417: exiting _queue_task() for managed_node3/set_fact 28023 1726853636.48428: done queuing things up, now waiting for results queue to drain 28023 1726853636.48429: waiting for pending results... 
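The two tasks around this point ("Check if system is ostree", `set_facts.yml:12`, a `stat`; and "Set flag to indicate system is ostree", `set_facts.yml:17`, a `set_fact`) are both skipped because `__network_is_ostree` was already set by an earlier `set_fact` (the log's `false_condition` is `not __network_is_ostree is defined`). A sketch of that detect-once pattern; the stat path and register name are assumptions, since ostree systems are conventionally detected via a marker file:

```yaml
# set_facts.yml, lines 12 and 17 — sketch. Task names and when-conditions
# come from the log; the path and register name are assumed.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted      # assumed marker path
  register: __ostree_booted_stat  # assumed register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

Guarding both tasks on `not __network_is_ostree is defined` makes the check idempotent across repeated role invocations in one play, which is exactly why this second pass through `set_facts.yml` skips them.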
28023 1726853636.48568: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28023 1726853636.48730: in run() - task 02083763-bbaf-fdb6-dad7-0000000006aa 28023 1726853636.48747: variable 'ansible_search_path' from source: unknown 28023 1726853636.48752: variable 'ansible_search_path' from source: unknown 28023 1726853636.48793: calling self._execute() 28023 1726853636.48897: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853636.48903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853636.48912: variable 'omit' from source: magic vars 28023 1726853636.49376: variable 'ansible_distribution_major_version' from source: facts 28023 1726853636.49380: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853636.49499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853636.49785: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853636.49826: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853636.49869: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853636.49904: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853636.49993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853636.50015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853636.50067: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853636.50076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853636.50192: variable '__network_is_ostree' from source: set_fact 28023 1726853636.50195: Evaluated conditional (not __network_is_ostree is defined): False 28023 1726853636.50198: when evaluation is False, skipping this task 28023 1726853636.50200: _execute() done 28023 1726853636.50203: dumping result to json 28023 1726853636.50205: done dumping result, returning 28023 1726853636.50208: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-fdb6-dad7-0000000006aa] 28023 1726853636.50210: sending task result for task 02083763-bbaf-fdb6-dad7-0000000006aa 28023 1726853636.50353: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000006aa 28023 1726853636.50356: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28023 1726853636.50434: no more pending results, returning what we have 28023 1726853636.50437: results queue empty 28023 1726853636.50438: checking for any_errors_fatal 28023 1726853636.50443: done checking for any_errors_fatal 28023 1726853636.50444: checking for max_fail_percentage 28023 1726853636.50446: done checking for max_fail_percentage 28023 1726853636.50446: checking to see if all hosts have failed and the running result is not ok 28023 1726853636.50448: done checking to see if all hosts have failed 28023 1726853636.50448: getting the remaining hosts for this loop 28023 1726853636.50450: done getting the remaining hosts for this loop 
28023 1726853636.50453: getting the next task for host managed_node3
28023 1726853636.50464: done getting next task for host managed_node3
28023 1726853636.50467: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
28023 1726853636.50474: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
28023 1726853636.50493: getting variables
28023 1726853636.50495: in VariableManager get_vars()
28023 1726853636.50617: Calling all_inventory to load vars for managed_node3
28023 1726853636.50620: Calling groups_inventory to load vars for managed_node3
28023 1726853636.50623: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853636.50634: Calling all_plugins_play to load vars for managed_node3
28023 1726853636.50637: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853636.50640: Calling groups_plugins_play to load vars for managed_node3
28023 1726853636.52097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853636.53933: done with get_vars()
28023 1726853636.53961: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024  13:33:56 -0400 (0:00:00.061)       0:00:28.624 ******
28023 1726853636.54046: entering _queue_task() for managed_node3/service_facts
28023 1726853636.54392: worker is 1 (out of 1 available)
28023 1726853636.54408: exiting _queue_task() for managed_node3/service_facts
28023 1726853636.54423: done queuing things up, now waiting for results queue to drain
28023 1726853636.54424: waiting for pending results...
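The `service_facts` task queued here is what produces the large `ansible_facts.services` mapping seen later in this log. On a systemd host that data ultimately comes from `systemctl list-units` output. The following is a simplified, hypothetical parser for that output, shaped to match the entries in the log; the real `ansible.builtin.service_facts` module also handles sysv/upstart sources and unit-file status, which this sketch omits:

```python
# Simplified approximation of how systemd service state could be turned into
# a dict shaped like ansible_facts.services. Not the actual module source.
def parse_systemctl_units(output: str) -> dict:
    """Parse `systemctl list-units --type service --all --plain --no-legend`
    style output: UNIT LOAD ACTIVE SUB DESCRIPTION, whitespace separated."""
    services = {}
    for line in output.splitlines():
        fields = line.split(None, 4)
        if len(fields) < 4 or not fields[0].endswith(".service"):
            continue
        unit, _load, _active, sub = fields[:4]
        services[unit] = {
            "name": unit,
            # the log reports "running" or "stopped" in the "state" field
            "state": "running" if sub == "running" else "stopped",
            "source": "systemd",
        }
    return services

sample = """\
sshd.service    loaded active   running OpenSSH server daemon
kdump.service   loaded inactive dead    Crash recovery kernel arming
"""
parsed = parse_systemctl_units(sample)   # sshd running, kdump stopped
```

The "state"/"status"/"source" keys mirror the JSON the module returns below; treat the column handling as an assumption about `systemctl` output rather than a spec.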
28023 1726853636.54816: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running
28023 1726853636.54902: in run() - task 02083763-bbaf-fdb6-dad7-0000000006ac
28023 1726853636.54918: variable 'ansible_search_path' from source: unknown
28023 1726853636.54922: variable 'ansible_search_path' from source: unknown
28023 1726853636.54956: calling self._execute()
28023 1726853636.55129: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853636.55133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853636.55136: variable 'omit' from source: magic vars
28023 1726853636.55495: variable 'ansible_distribution_major_version' from source: facts
28023 1726853636.55507: Evaluated conditional (ansible_distribution_major_version != '6'): True
28023 1726853636.55513: variable 'omit' from source: magic vars
28023 1726853636.55607: variable 'omit' from source: magic vars
28023 1726853636.55639: variable 'omit' from source: magic vars
28023 1726853636.55690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28023 1726853636.55778: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28023 1726853636.55781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28023 1726853636.55784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28023 1726853636.55786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28023 1726853636.55817: variable 'inventory_hostname' from source: host vars for 'managed_node3'
28023 1726853636.55820: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853636.55824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853636.55933: Set connection var ansible_shell_type to sh
28023 1726853636.55941: Set connection var ansible_shell_executable to /bin/sh
28023 1726853636.55947: Set connection var ansible_module_compression to ZIP_DEFLATED
28023 1726853636.55953: Set connection var ansible_connection to ssh
28023 1726853636.55961: Set connection var ansible_pipelining to False
28023 1726853636.55993: Set connection var ansible_timeout to 10
28023 1726853636.56001: variable 'ansible_shell_executable' from source: unknown
28023 1726853636.56004: variable 'ansible_connection' from source: unknown
28023 1726853636.56007: variable 'ansible_module_compression' from source: unknown
28023 1726853636.56009: variable 'ansible_shell_type' from source: unknown
28023 1726853636.56012: variable 'ansible_shell_executable' from source: unknown
28023 1726853636.56028: variable 'ansible_host' from source: host vars for 'managed_node3'
28023 1726853636.56030: variable 'ansible_pipelining' from source: unknown
28023 1726853636.56032: variable 'ansible_timeout' from source: unknown
28023 1726853636.56035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
28023 1726853636.56249: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
28023 1726853636.56254: variable 'omit' from source: magic vars
28023 1726853636.56256: starting attempt loop
28023 1726853636.56258: running the handler
28023 1726853636.56476: _low_level_execute_command(): starting
28023 1726853636.56480: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
28023 1726853636.57176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
28023 1726853636.57181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853636.57184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853636.57187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853636.57303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853636.59041: stdout chunk (state=3): >>>/root <<< 28023 1726853636.59194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853636.59198: stdout chunk (state=3): >>><<< 28023 1726853636.59210: stderr chunk (state=3): >>><<< 28023 1726853636.59294: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853636.59309: _low_level_execute_command(): starting 28023 1726853636.59317: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676 `" && echo ansible-tmp-1726853636.5929382-29350-238749628216676="` echo /root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676 `" ) && sleep 0' 28023 1726853636.60553: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853636.60575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853636.60580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853636.60595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853636.60608: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853636.60622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853636.60628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853636.60838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853636.60874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853636.62836: stdout chunk (state=3): >>>ansible-tmp-1726853636.5929382-29350-238749628216676=/root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676 <<< 28023 1726853636.63011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853636.63014: stdout chunk (state=3): >>><<< 28023 1726853636.63017: stderr chunk (state=3): >>><<< 28023 1726853636.63045: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853636.5929382-29350-238749628216676=/root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853636.63208: variable 'ansible_module_compression' from source: unknown 28023 1726853636.63211: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28023 1726853636.63229: variable 'ansible_facts' from source: unknown 28023 1726853636.63473: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676/AnsiballZ_service_facts.py 28023 1726853636.63707: Sending initial data 28023 1726853636.63716: Sent initial data (162 bytes) 28023 1726853636.64341: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853636.64377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853636.64469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853636.66120: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853636.66179: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853636.66261: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp_96xsl9h /root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676/AnsiballZ_service_facts.py <<< 28023 1726853636.66264: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676/AnsiballZ_service_facts.py" <<< 28023 1726853636.66314: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp_96xsl9h" to remote "/root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676/AnsiballZ_service_facts.py" <<< 28023 1726853636.67190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853636.67219: stderr chunk (state=3): >>><<< 28023 1726853636.67249: stdout chunk (state=3): >>><<< 28023 1726853636.67259: done transferring module to remote 28023 1726853636.67278: _low_level_execute_command(): starting 28023 1726853636.67355: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676/ /root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676/AnsiballZ_service_facts.py && sleep 0' 28023 1726853636.67921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853636.67987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853636.68046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853636.68065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853636.68091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853636.68179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853636.70037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853636.70039: stdout chunk (state=3): >>><<< 28023 1726853636.70041: stderr chunk (state=3): >>><<< 28023 1726853636.70076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853636.70079: _low_level_execute_command(): starting 28023 1726853636.70082: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676/AnsiballZ_service_facts.py && sleep 0' 28023 1726853636.70697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853636.70701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853636.70704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853636.70764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 28023 1726853638.31322: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 28023 1726853638.31330: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 28023 1726853638.31356: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 28023 1726853638.31373: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": 
{"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 28023 1726853638.31395: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", 
"status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28023 1726853638.32950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853638.32981: stderr chunk (state=3): >>><<< 28023 1726853638.32984: stdout chunk (state=3): >>><<< 28023 1726853638.33014: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": 
{"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853638.33447: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853638.33454: _low_level_execute_command(): starting 28023 1726853638.33466: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853636.5929382-29350-238749628216676/ > /dev/null 2>&1 && sleep 0' 28023 1726853638.33914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853638.33918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853638.33920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853638.33922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 <<< 28023 1726853638.33924: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853638.33975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853638.33981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853638.34040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853638.35901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853638.35924: stderr chunk (state=3): >>><<< 28023 1726853638.35928: stdout chunk (state=3): >>><<< 28023 1726853638.35939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 28023 1726853638.35945: handler run complete 28023 1726853638.36063: variable 'ansible_facts' from source: unknown 28023 1726853638.36158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853638.36441: variable 'ansible_facts' from source: unknown 28023 1726853638.36522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853638.36637: attempt loop complete, returning result 28023 1726853638.36641: _execute() done 28023 1726853638.36645: dumping result to json 28023 1726853638.36685: done dumping result, returning 28023 1726853638.36693: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-fdb6-dad7-0000000006ac] 28023 1726853638.36698: sending task result for task 02083763-bbaf-fdb6-dad7-0000000006ac ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853638.37305: no more pending results, returning what we have 28023 1726853638.37308: results queue empty 28023 1726853638.37309: checking for any_errors_fatal 28023 1726853638.37313: done checking for any_errors_fatal 28023 1726853638.37313: checking for max_fail_percentage 28023 1726853638.37315: done checking for max_fail_percentage 28023 1726853638.37315: checking to see if all hosts have failed and the running result is not ok 28023 1726853638.37316: done checking to see if all hosts have failed 28023 1726853638.37317: getting the remaining hosts for this loop 28023 1726853638.37318: done getting the remaining hosts for this loop 28023 1726853638.37321: getting the next task for host managed_node3 28023 1726853638.37326: done getting next task for host managed_node3 28023 1726853638.37329: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 28023 1726853638.37334: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853638.37344: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000006ac 28023 1726853638.37348: WORKER PROCESS EXITING 28023 1726853638.37354: getting variables 28023 1726853638.37355: in VariableManager get_vars() 28023 1726853638.37384: Calling all_inventory to load vars for managed_node3 28023 1726853638.37386: Calling groups_inventory to load vars for managed_node3 28023 1726853638.37388: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853638.37394: Calling all_plugins_play to load vars for managed_node3 28023 1726853638.37396: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853638.37398: Calling groups_plugins_play to load vars for managed_node3 28023 1726853638.38506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853638.40100: done with get_vars() 28023 1726853638.40120: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:33:58 -0400 (0:00:01.861) 0:00:30.486 ****** 28023 1726853638.40218: entering _queue_task() for managed_node3/package_facts 28023 1726853638.40552: worker is 1 (out of 1 available) 28023 1726853638.40566: exiting _queue_task() for managed_node3/package_facts 28023 1726853638.40582: done queuing things up, now waiting for results queue to drain 28023 1726853638.40583: waiting for pending results... 
28023 1726853638.40773: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 28023 1726853638.40880: in run() - task 02083763-bbaf-fdb6-dad7-0000000006ad 28023 1726853638.40894: variable 'ansible_search_path' from source: unknown 28023 1726853638.40899: variable 'ansible_search_path' from source: unknown 28023 1726853638.40926: calling self._execute() 28023 1726853638.41003: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853638.41008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853638.41016: variable 'omit' from source: magic vars 28023 1726853638.41293: variable 'ansible_distribution_major_version' from source: facts 28023 1726853638.41303: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853638.41309: variable 'omit' from source: magic vars 28023 1726853638.41352: variable 'omit' from source: magic vars 28023 1726853638.41381: variable 'omit' from source: magic vars 28023 1726853638.41411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853638.41437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853638.41455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853638.41473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853638.41484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853638.41507: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853638.41510: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853638.41513: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 28023 1726853638.41583: Set connection var ansible_shell_type to sh 28023 1726853638.41592: Set connection var ansible_shell_executable to /bin/sh 28023 1726853638.41595: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853638.41600: Set connection var ansible_connection to ssh 28023 1726853638.41606: Set connection var ansible_pipelining to False 28023 1726853638.41612: Set connection var ansible_timeout to 10 28023 1726853638.41631: variable 'ansible_shell_executable' from source: unknown 28023 1726853638.41633: variable 'ansible_connection' from source: unknown 28023 1726853638.41636: variable 'ansible_module_compression' from source: unknown 28023 1726853638.41638: variable 'ansible_shell_type' from source: unknown 28023 1726853638.41641: variable 'ansible_shell_executable' from source: unknown 28023 1726853638.41643: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853638.41647: variable 'ansible_pipelining' from source: unknown 28023 1726853638.41649: variable 'ansible_timeout' from source: unknown 28023 1726853638.41653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853638.41794: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853638.41804: variable 'omit' from source: magic vars 28023 1726853638.41807: starting attempt loop 28023 1726853638.41810: running the handler 28023 1726853638.41824: _low_level_execute_command(): starting 28023 1726853638.41830: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853638.42554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853638.42590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853638.44290: stdout chunk (state=3): >>>/root <<< 28023 1726853638.44386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853638.44408: stderr chunk (state=3): >>><<< 28023 1726853638.44412: stdout chunk (state=3): >>><<< 28023 1726853638.44430: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853638.44441: _low_level_execute_command(): starting 28023 1726853638.44447: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461 `" && echo ansible-tmp-1726853638.4442976-29400-16012125523461="` echo /root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461 `" ) && sleep 0' 28023 1726853638.44885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853638.44889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853638.44891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853638.44893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853638.44903: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853638.45029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853638.45046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853638.46953: stdout chunk (state=3): >>>ansible-tmp-1726853638.4442976-29400-16012125523461=/root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461 <<< 28023 1726853638.47068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853638.47089: stderr chunk (state=3): >>><<< 28023 1726853638.47092: stdout chunk (state=3): >>><<< 28023 1726853638.47107: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853638.4442976-29400-16012125523461=/root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853638.47144: variable 'ansible_module_compression' from source: unknown 28023 1726853638.47184: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28023 1726853638.47231: variable 'ansible_facts' from source: unknown 28023 1726853638.47460: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461/AnsiballZ_package_facts.py 28023 1726853638.47716: Sending initial data 28023 1726853638.47723: Sent initial data (161 bytes) 28023 1726853638.48188: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853638.48201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853638.48223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853638.48233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853638.48307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853638.48366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853638.49999: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853638.50048: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853638.50137: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpcb07tnr4 /root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461/AnsiballZ_package_facts.py <<< 28023 1726853638.50140: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461/AnsiballZ_package_facts.py" <<< 28023 1726853638.50205: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpcb07tnr4" to remote "/root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461/AnsiballZ_package_facts.py" <<< 28023 1726853638.51738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853638.51780: stderr chunk (state=3): >>><<< 28023 1726853638.51783: stdout chunk (state=3): >>><<< 28023 1726853638.51799: done transferring module to remote 28023 1726853638.51809: _low_level_execute_command(): starting 28023 1726853638.51813: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461/ /root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461/AnsiballZ_package_facts.py && sleep 0' 28023 1726853638.52334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853638.52349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853638.52375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853638.52459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853638.54380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853638.54384: stdout chunk (state=3): >>><<< 28023 1726853638.54482: stderr chunk (state=3): >>><<< 28023 1726853638.54486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853638.54488: _low_level_execute_command(): starting 28023 1726853638.54490: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461/AnsiballZ_package_facts.py && sleep 0' 28023 1726853638.55036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853638.55049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853638.55062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853638.55123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853638.55196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853638.55213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853638.55244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 
1726853638.55343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853639.00032: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 28023 1726853639.00053: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": 
"1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", 
"version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 28023 1726853639.00073: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 28023 1726853639.00109: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 28023 1726853639.00123: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", 
"version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],<<< 28023 1726853639.00153: stdout chunk (state=3): >>> "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source":
"rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": 
"cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": 
"logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", 
"version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1",<<< 28023 1726853639.00168: stdout chunk (state=3): >>> "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2",
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 28023 1726853639.00174: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue",<<< 28023 1726853639.00185: stdout chunk (state=3): >>> "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source":
"rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 28023 1726853639.00208: stdout chunk (state=3): >>> "perl-FileHandle": [{"name": "perl-FileHandle", "version":
"2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": 
[{"name": "perl-Fcntl", "version": "1<<< 28023 1726853639.00234: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": 
"perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": 
[{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": 
"python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 28023 1726853639.00240: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", 
"release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", 
"epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28023 1726853639.02048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853639.02078: stderr chunk (state=3): >>><<< 28023 1726853639.02081: stdout chunk (state=3): >>><<< 28023 1726853639.02128: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853639.03664: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853639.03668: _low_level_execute_command(): starting 28023 1726853639.03670: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853638.4442976-29400-16012125523461/ > /dev/null 2>&1 && sleep 0' 28023 1726853639.04176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853639.04190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853639.04201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853639.04218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853639.04232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853639.04245: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853639.04258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853639.04291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853639.04307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853639.04385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853639.04405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853639.04435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853639.04514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853639.06480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853639.06492: stdout chunk (state=3): >>><<< 28023 1726853639.06504: stderr chunk (state=3): >>><<< 28023 1726853639.06530: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853639.06547: handler run complete 28023 1726853639.07532: variable 'ansible_facts' from source: unknown 28023 1726853639.08015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853639.09638: variable 'ansible_facts' from source: unknown 28023 1726853639.09875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853639.10252: attempt loop complete, returning result 28023 1726853639.10262: _execute() done 28023 1726853639.10265: dumping result to json 28023 1726853639.10379: done dumping result, returning 28023 1726853639.10387: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-fdb6-dad7-0000000006ad] 28023 1726853639.10390: sending task result for task 02083763-bbaf-fdb6-dad7-0000000006ad 28023 1726853639.12147: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000006ad 28023 1726853639.12151: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853639.12241: no more pending results, returning what we have 28023 1726853639.12243: results queue empty 28023 1726853639.12244: checking for any_errors_fatal 28023 1726853639.12249: done checking for any_errors_fatal 28023 1726853639.12250: checking for max_fail_percentage 28023 1726853639.12251: done checking for max_fail_percentage 28023 1726853639.12251: checking to see if all hosts have failed and the running result is not ok 28023 1726853639.12252: 
done checking to see if all hosts have failed 28023 1726853639.12253: getting the remaining hosts for this loop 28023 1726853639.12253: done getting the remaining hosts for this loop 28023 1726853639.12256: getting the next task for host managed_node3 28023 1726853639.12263: done getting next task for host managed_node3 28023 1726853639.12266: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28023 1726853639.12269: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853639.12278: getting variables 28023 1726853639.12279: in VariableManager get_vars() 28023 1726853639.12304: Calling all_inventory to load vars for managed_node3 28023 1726853639.12306: Calling groups_inventory to load vars for managed_node3 28023 1726853639.12307: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853639.12314: Calling all_plugins_play to load vars for managed_node3 28023 1726853639.12315: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853639.12318: Calling groups_plugins_play to load vars for managed_node3 28023 1726853639.13000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853639.13958: done with get_vars() 28023 1726853639.13987: done getting variables 28023 1726853639.14051: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:33:59 -0400 (0:00:00.738) 0:00:31.224 ****** 28023 1726853639.14093: entering _queue_task() for managed_node3/debug 28023 1726853639.14539: worker is 1 (out of 1 available) 28023 1726853639.14552: exiting _queue_task() for managed_node3/debug 28023 1726853639.14564: done queuing things up, now waiting for results queue to drain 28023 1726853639.14565: waiting for pending results... 
28023 1726853639.14807: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 28023 1726853639.14922: in run() - task 02083763-bbaf-fdb6-dad7-000000000642 28023 1726853639.14944: variable 'ansible_search_path' from source: unknown 28023 1726853639.15011: variable 'ansible_search_path' from source: unknown 28023 1726853639.15015: calling self._execute() 28023 1726853639.15114: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853639.15125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853639.15131: variable 'omit' from source: magic vars 28023 1726853639.15423: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.15433: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853639.15440: variable 'omit' from source: magic vars 28023 1726853639.15480: variable 'omit' from source: magic vars 28023 1726853639.15546: variable 'network_provider' from source: set_fact 28023 1726853639.15565: variable 'omit' from source: magic vars 28023 1726853639.15597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853639.15627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853639.15643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853639.15657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853639.15674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853639.15696: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853639.15700: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 
1726853639.15703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853639.15776: Set connection var ansible_shell_type to sh 28023 1726853639.15779: Set connection var ansible_shell_executable to /bin/sh 28023 1726853639.15829: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853639.15832: Set connection var ansible_connection to ssh 28023 1726853639.15835: Set connection var ansible_pipelining to False 28023 1726853639.15838: Set connection var ansible_timeout to 10 28023 1726853639.15840: variable 'ansible_shell_executable' from source: unknown 28023 1726853639.15843: variable 'ansible_connection' from source: unknown 28023 1726853639.15845: variable 'ansible_module_compression' from source: unknown 28023 1726853639.15847: variable 'ansible_shell_type' from source: unknown 28023 1726853639.15849: variable 'ansible_shell_executable' from source: unknown 28023 1726853639.15851: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853639.15853: variable 'ansible_pipelining' from source: unknown 28023 1726853639.15856: variable 'ansible_timeout' from source: unknown 28023 1726853639.15858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853639.15932: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853639.15941: variable 'omit' from source: magic vars 28023 1726853639.15946: starting attempt loop 28023 1726853639.15950: running the handler 28023 1726853639.15993: handler run complete 28023 1726853639.16002: attempt loop complete, returning result 28023 1726853639.16006: _execute() done 28023 1726853639.16008: dumping result to json 28023 1726853639.16011: done dumping result, returning 
28023 1726853639.16017: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-fdb6-dad7-000000000642] 28023 1726853639.16020: sending task result for task 02083763-bbaf-fdb6-dad7-000000000642 28023 1726853639.16100: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000642 28023 1726853639.16104: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 28023 1726853639.16166: no more pending results, returning what we have 28023 1726853639.16169: results queue empty 28023 1726853639.16170: checking for any_errors_fatal 28023 1726853639.16182: done checking for any_errors_fatal 28023 1726853639.16182: checking for max_fail_percentage 28023 1726853639.16184: done checking for max_fail_percentage 28023 1726853639.16185: checking to see if all hosts have failed and the running result is not ok 28023 1726853639.16186: done checking to see if all hosts have failed 28023 1726853639.16186: getting the remaining hosts for this loop 28023 1726853639.16188: done getting the remaining hosts for this loop 28023 1726853639.16192: getting the next task for host managed_node3 28023 1726853639.16198: done getting next task for host managed_node3 28023 1726853639.16202: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28023 1726853639.16206: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853639.16224: getting variables 28023 1726853639.16226: in VariableManager get_vars() 28023 1726853639.16263: Calling all_inventory to load vars for managed_node3 28023 1726853639.16265: Calling groups_inventory to load vars for managed_node3 28023 1726853639.16267: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853639.16277: Calling all_plugins_play to load vars for managed_node3 28023 1726853639.16280: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853639.16282: Calling groups_plugins_play to load vars for managed_node3 28023 1726853639.17456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853639.18853: done with get_vars() 28023 1726853639.18877: done getting variables 28023 1726853639.18930: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:33:59 -0400 (0:00:00.048) 0:00:31.273 ****** 28023 1726853639.18962: entering _queue_task() for managed_node3/fail 28023 
1726853639.19256: worker is 1 (out of 1 available) 28023 1726853639.19268: exiting _queue_task() for managed_node3/fail 28023 1726853639.19482: done queuing things up, now waiting for results queue to drain 28023 1726853639.19484: waiting for pending results... 28023 1726853639.19689: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28023 1726853639.19710: in run() - task 02083763-bbaf-fdb6-dad7-000000000643 28023 1726853639.19731: variable 'ansible_search_path' from source: unknown 28023 1726853639.19739: variable 'ansible_search_path' from source: unknown 28023 1726853639.19780: calling self._execute() 28023 1726853639.19885: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853639.19897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853639.19910: variable 'omit' from source: magic vars 28023 1726853639.20287: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.20361: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853639.20428: variable 'network_state' from source: role '' defaults 28023 1726853639.20443: Evaluated conditional (network_state != {}): False 28023 1726853639.20451: when evaluation is False, skipping this task 28023 1726853639.20457: _execute() done 28023 1726853639.20478: dumping result to json 28023 1726853639.20492: done dumping result, returning 28023 1726853639.20504: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-fdb6-dad7-000000000643] 28023 1726853639.20514: sending task result for task 02083763-bbaf-fdb6-dad7-000000000643 skipping: [managed_node3] => { "changed": false, "false_condition": 
"network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853639.20756: no more pending results, returning what we have 28023 1726853639.20759: results queue empty 28023 1726853639.20760: checking for any_errors_fatal 28023 1726853639.20767: done checking for any_errors_fatal 28023 1726853639.20768: checking for max_fail_percentage 28023 1726853639.20769: done checking for max_fail_percentage 28023 1726853639.20770: checking to see if all hosts have failed and the running result is not ok 28023 1726853639.20774: done checking to see if all hosts have failed 28023 1726853639.20774: getting the remaining hosts for this loop 28023 1726853639.20776: done getting the remaining hosts for this loop 28023 1726853639.20780: getting the next task for host managed_node3 28023 1726853639.20787: done getting next task for host managed_node3 28023 1726853639.20792: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28023 1726853639.20796: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853639.20820: getting variables 28023 1726853639.20822: in VariableManager get_vars() 28023 1726853639.20865: Calling all_inventory to load vars for managed_node3 28023 1726853639.20869: Calling groups_inventory to load vars for managed_node3 28023 1726853639.21042: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853639.21049: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000643 28023 1726853639.21052: WORKER PROCESS EXITING 28023 1726853639.21061: Calling all_plugins_play to load vars for managed_node3 28023 1726853639.21064: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853639.21067: Calling groups_plugins_play to load vars for managed_node3 28023 1726853639.22294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853639.23639: done with get_vars() 28023 1726853639.23655: done getting variables 28023 1726853639.23699: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:33:59 -0400 (0:00:00.047) 0:00:31.321 ****** 28023 1726853639.23722: entering _queue_task() for managed_node3/fail 28023 1726853639.23944: worker is 1 (out of 1 available) 28023 1726853639.23960: exiting _queue_task() for managed_node3/fail 28023 1726853639.23975: done queuing things up, now waiting for results queue to drain 28023 1726853639.23976: waiting for pending results... 
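The task just skipped (roles/network/tasks/main.yml:11) and the one about to run (main.yml:18) both gate on the same pair of conditionals, evaluated in order: `ansible_distribution_major_version != '6'` (True) and `network_state != {}` (False, so the task is skipped). A hedged sketch of what such a guard task likely looks like; only the task name, path, and the two logged conditions are confirmed by this log, so the `fail` message and any further `when` entries are assumptions:

```yaml
# Sketch reconstructed from the log, NOT the actual role source.
# Confirmed by the log: the task name and the two evaluated conditions.
# Assumed: the fail message wording and any additional conditions.
- name: Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  fail:
    msg: network_state is not supported with the initscripts provider  # assumed wording
  when:
    - ansible_distribution_major_version != '6'  # log: Evaluated conditional ... True
    - network_state != {}                        # log: Evaluated conditional ... False -> skipped
```

Because `when` entries are ANDed and evaluated lazily, the log only records conditions up to the first False one; later conditions, if any, never appear.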
28023 1726853639.24162: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28023 1726853639.24260: in run() - task 02083763-bbaf-fdb6-dad7-000000000644 28023 1726853639.24272: variable 'ansible_search_path' from source: unknown 28023 1726853639.24276: variable 'ansible_search_path' from source: unknown 28023 1726853639.24306: calling self._execute() 28023 1726853639.24384: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853639.24388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853639.24392: variable 'omit' from source: magic vars 28023 1726853639.24667: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.24679: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853639.24761: variable 'network_state' from source: role '' defaults 28023 1726853639.24769: Evaluated conditional (network_state != {}): False 28023 1726853639.24775: when evaluation is False, skipping this task 28023 1726853639.24778: _execute() done 28023 1726853639.24781: dumping result to json 28023 1726853639.24783: done dumping result, returning 28023 1726853639.24787: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-fdb6-dad7-000000000644] 28023 1726853639.24793: sending task result for task 02083763-bbaf-fdb6-dad7-000000000644 28023 1726853639.24879: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000644 28023 1726853639.24882: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853639.24925: no more pending results, returning what we have 28023 
1726853639.24929: results queue empty 28023 1726853639.24930: checking for any_errors_fatal 28023 1726853639.24937: done checking for any_errors_fatal 28023 1726853639.24938: checking for max_fail_percentage 28023 1726853639.24939: done checking for max_fail_percentage 28023 1726853639.24939: checking to see if all hosts have failed and the running result is not ok 28023 1726853639.24941: done checking to see if all hosts have failed 28023 1726853639.24941: getting the remaining hosts for this loop 28023 1726853639.24943: done getting the remaining hosts for this loop 28023 1726853639.24946: getting the next task for host managed_node3 28023 1726853639.24952: done getting next task for host managed_node3 28023 1726853639.24956: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28023 1726853639.24962: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853639.24982: getting variables 28023 1726853639.24983: in VariableManager get_vars() 28023 1726853639.25018: Calling all_inventory to load vars for managed_node3 28023 1726853639.25021: Calling groups_inventory to load vars for managed_node3 28023 1726853639.25023: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853639.25031: Calling all_plugins_play to load vars for managed_node3 28023 1726853639.25034: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853639.25037: Calling groups_plugins_play to load vars for managed_node3 28023 1726853639.26246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853639.27815: done with get_vars() 28023 1726853639.27840: done getting variables 28023 1726853639.27901: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:33:59 -0400 (0:00:00.042) 0:00:31.363 ****** 28023 1726853639.27934: entering _queue_task() for managed_node3/fail 28023 1726853639.28252: worker is 1 (out of 1 available) 28023 1726853639.28266: exiting _queue_task() for managed_node3/fail 28023 1726853639.28383: done queuing things up, now waiting for results queue to drain 28023 1726853639.28385: waiting for pending results... 
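Both guard tasks skip because `network_state` compares equal to `{}`, and the log attributes the variable to "role '' defaults". Taken together, this implies the role's defaults pin it to an empty mapping; the defaults file location is an assumption:

```yaml
# Inferred from "variable 'network_state' from source: role '' defaults"
# plus (network_state != {}) evaluating False. The file path
# (roles/network/defaults/main.yml) is an assumption, not shown in the log.
network_state: {}
```

Any play that sets a non-empty `network_state` would flip these conditionals to True and trigger the corresponding abort tasks.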
28023 1726853639.28621: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28023 1726853639.28877: in run() - task 02083763-bbaf-fdb6-dad7-000000000645 28023 1726853639.28882: variable 'ansible_search_path' from source: unknown 28023 1726853639.28885: variable 'ansible_search_path' from source: unknown 28023 1726853639.28888: calling self._execute() 28023 1726853639.28891: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853639.28897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853639.28913: variable 'omit' from source: magic vars 28023 1726853639.29286: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.29303: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853639.29486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853639.31703: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853639.31784: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853639.31824: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853639.31863: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853639.31893: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853639.31969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.32002: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.32048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.32073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.32092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.32188: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.32267: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28023 1726853639.32320: variable 'ansible_distribution' from source: facts 28023 1726853639.32330: variable '__network_rh_distros' from source: role '' defaults 28023 1726853639.32346: Evaluated conditional (ansible_distribution in __network_rh_distros): True 28023 1726853639.32598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.32614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.32640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 
1726853639.32677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.32749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.32752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.32755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.32775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.32813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.32826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.32868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.32966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 28023 1726853639.32970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.32975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.32977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.33250: variable 'network_connections' from source: include params 28023 1726853639.33263: variable 'interface0' from source: play vars 28023 1726853639.33329: variable 'interface0' from source: play vars 28023 1726853639.33339: variable 'interface1' from source: play vars 28023 1726853639.33401: variable 'interface1' from source: play vars 28023 1726853639.33404: variable 'network_state' from source: role '' defaults 28023 1726853639.33469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853639.33631: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853639.33664: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853639.33694: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853639.34056: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853639.34075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853639.34095: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853639.34166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.34169: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853639.34175: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 28023 1726853639.34177: when evaluation is False, skipping this task 28023 1726853639.34180: _execute() done 28023 1726853639.34182: dumping result to json 28023 1726853639.34184: done dumping result, returning 28023 1726853639.34187: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-fdb6-dad7-000000000645] 28023 1726853639.34194: sending task result for task 02083763-bbaf-fdb6-dad7-000000000645 28023 1726853639.34396: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000645 28023 1726853639.34399: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 
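The `false_condition` recorded above is the full Jinja expression guarding the teaming abort task. The expression below is verbatim from the log; only the multi-line layout is editorial. The first `selectattr("type", "defined")` keeps list items that actually have a `type` key, the second keeps those whose `type` matches `^team$`, so the task fires only when at least one team profile is defined in either `network_connections` or `network_state`:

```yaml
# Condition text is verbatim from the logged false_condition;
# the folded-scalar layout is editorial.
when: >-
  network_connections | selectattr("type", "defined")
  | selectattr("type", "match", "^team$") | list | length > 0
  or network_state.get("interfaces", [])
  | selectattr("type", "defined")
  | selectattr("type", "match", "^team$") | list | length > 0
```

Here both branches evaluate to empty lists (the play defines `interface0`/`interface1` connections without `type: team`, and `network_state` is the empty default), so the whole condition is False and the task skips.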
28023 1726853639.34440: no more pending results, returning what we have 28023 1726853639.34443: results queue empty 28023 1726853639.34444: checking for any_errors_fatal 28023 1726853639.34450: done checking for any_errors_fatal 28023 1726853639.34450: checking for max_fail_percentage 28023 1726853639.34452: done checking for max_fail_percentage 28023 1726853639.34452: checking to see if all hosts have failed and the running result is not ok 28023 1726853639.34453: done checking to see if all hosts have failed 28023 1726853639.34454: getting the remaining hosts for this loop 28023 1726853639.34455: done getting the remaining hosts for this loop 28023 1726853639.34461: getting the next task for host managed_node3 28023 1726853639.34467: done getting next task for host managed_node3 28023 1726853639.34472: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28023 1726853639.34475: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853639.34572: getting variables 28023 1726853639.34574: in VariableManager get_vars() 28023 1726853639.34613: Calling all_inventory to load vars for managed_node3 28023 1726853639.34616: Calling groups_inventory to load vars for managed_node3 28023 1726853639.34618: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853639.34627: Calling all_plugins_play to load vars for managed_node3 28023 1726853639.34630: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853639.34632: Calling groups_plugins_play to load vars for managed_node3 28023 1726853639.36031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853639.37508: done with get_vars() 28023 1726853639.37533: done getting variables 28023 1726853639.37603: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:33:59 -0400 (0:00:00.097) 0:00:31.460 ****** 28023 1726853639.37637: entering _queue_task() for managed_node3/dnf 28023 1726853639.37963: worker is 1 (out of 1 available) 28023 1726853639.38179: exiting _queue_task() for managed_node3/dnf 28023 1726853639.38190: done queuing things up, now waiting for results queue to drain 28023 1726853639.38192: waiting for pending results... 
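The next task (roles/network/tasks/main.yml:36) is queued through the `dnf` action plugin. A hedged sketch of its likely shape; the log confirms the task name, path, action plugin, and (further on) the false_condition `__network_wireless_connections_defined or __network_team_connections_defined`, but records no module arguments, so the arguments shown are purely illustrative:

```yaml
# Sketch from the log: name, path, dnf action plugin, and when-condition
# are logged; the module arguments are NOT and are hypothetical here.
- name: Check if updates for network packages are available through the
    DNF package manager due to wireless or team interfaces
  dnf:
    list: updates  # hypothetical argument choice
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Since neither wireless nor team connections are defined in this run, the condition evaluates False and the check is skipped without contacting the package manager.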
28023 1726853639.38352: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28023 1726853639.38418: in run() - task 02083763-bbaf-fdb6-dad7-000000000646 28023 1726853639.38545: variable 'ansible_search_path' from source: unknown 28023 1726853639.38550: variable 'ansible_search_path' from source: unknown 28023 1726853639.38553: calling self._execute() 28023 1726853639.38575: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853639.38582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853639.38591: variable 'omit' from source: magic vars 28023 1726853639.39178: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.39182: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853639.39191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853639.41398: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853639.41480: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853639.41522: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853639.41560: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853639.41598: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853639.41683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.41715: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.41744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.41794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.41813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.41928: variable 'ansible_distribution' from source: facts 28023 1726853639.41937: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.41956: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28023 1726853639.42069: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853639.42205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.42238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.42268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.42314: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.42337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.42380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.42408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.42443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.42488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.42508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.42554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.42583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 
1726853639.42610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.42655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.42678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.42851: variable 'network_connections' from source: include params 28023 1726853639.42874: variable 'interface0' from source: play vars 28023 1726853639.42943: variable 'interface0' from source: play vars 28023 1726853639.42957: variable 'interface1' from source: play vars 28023 1726853639.43022: variable 'interface1' from source: play vars 28023 1726853639.43176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853639.43266: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853639.43313: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853639.43346: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853639.43380: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853639.43443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853639.43470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853639.43512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.43544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853639.43596: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853639.43845: variable 'network_connections' from source: include params 28023 1726853639.43856: variable 'interface0' from source: play vars 28023 1726853639.43951: variable 'interface0' from source: play vars 28023 1726853639.43954: variable 'interface1' from source: play vars 28023 1726853639.43993: variable 'interface1' from source: play vars 28023 1726853639.44022: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28023 1726853639.44068: when evaluation is False, skipping this task 28023 1726853639.44073: _execute() done 28023 1726853639.44075: dumping result to json 28023 1726853639.44077: done dumping result, returning 28023 1726853639.44079: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-000000000646] 28023 1726853639.44080: sending task result for task 02083763-bbaf-fdb6-dad7-000000000646 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28023 1726853639.44328: no more pending results, returning what we have 28023 1726853639.44332: 
results queue empty 28023 1726853639.44333: checking for any_errors_fatal 28023 1726853639.44342: done checking for any_errors_fatal 28023 1726853639.44343: checking for max_fail_percentage 28023 1726853639.44345: done checking for max_fail_percentage 28023 1726853639.44346: checking to see if all hosts have failed and the running result is not ok 28023 1726853639.44347: done checking to see if all hosts have failed 28023 1726853639.44348: getting the remaining hosts for this loop 28023 1726853639.44350: done getting the remaining hosts for this loop 28023 1726853639.44354: getting the next task for host managed_node3 28023 1726853639.44362: done getting next task for host managed_node3 28023 1726853639.44367: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28023 1726853639.44373: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853639.44398: getting variables 28023 1726853639.44400: in VariableManager get_vars() 28023 1726853639.44450: Calling all_inventory to load vars for managed_node3 28023 1726853639.44454: Calling groups_inventory to load vars for managed_node3 28023 1726853639.44456: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853639.44469: Calling all_plugins_play to load vars for managed_node3 28023 1726853639.44677: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853639.44682: Calling groups_plugins_play to load vars for managed_node3 28023 1726853639.45384: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000646 28023 1726853639.45387: WORKER PROCESS EXITING 28023 1726853639.45991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853639.47524: done with get_vars() 28023 1726853639.47550: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28023 1726853639.47618: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:33:59 -0400 (0:00:00.100) 0:00:31.560 ****** 28023 1726853639.47646: entering _queue_task() for managed_node3/yum 28023 1726853639.47961: worker is 1 (out of 1 available) 28023 1726853639.48177: exiting _queue_task() for managed_node3/yum 28023 1726853639.48189: done queuing things up, now 
waiting for results queue to drain 28023 1726853639.48190: waiting for pending results... 28023 1726853639.48279: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28023 1726853639.48422: in run() - task 02083763-bbaf-fdb6-dad7-000000000647 28023 1726853639.48441: variable 'ansible_search_path' from source: unknown 28023 1726853639.48448: variable 'ansible_search_path' from source: unknown 28023 1726853639.48491: calling self._execute() 28023 1726853639.48594: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853639.48606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853639.48621: variable 'omit' from source: magic vars 28023 1726853639.48994: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.49011: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853639.49189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853639.51709: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853639.51776: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853639.51817: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853639.59883: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853639.59919: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853639.59998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.60029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.60057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.60107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.60128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.60230: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.60251: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28023 1726853639.60258: when evaluation is False, skipping this task 28023 1726853639.60265: _execute() done 28023 1726853639.60273: dumping result to json 28023 1726853639.60281: done dumping result, returning 28023 1726853639.60295: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-000000000647] 28023 1726853639.60303: sending task result for task 02083763-bbaf-fdb6-dad7-000000000647 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28023 1726853639.60445: no more pending results, returning 
what we have 28023 1726853639.60449: results queue empty 28023 1726853639.60449: checking for any_errors_fatal 28023 1726853639.60455: done checking for any_errors_fatal 28023 1726853639.60455: checking for max_fail_percentage 28023 1726853639.60457: done checking for max_fail_percentage 28023 1726853639.60458: checking to see if all hosts have failed and the running result is not ok 28023 1726853639.60459: done checking to see if all hosts have failed 28023 1726853639.60459: getting the remaining hosts for this loop 28023 1726853639.60461: done getting the remaining hosts for this loop 28023 1726853639.60465: getting the next task for host managed_node3 28023 1726853639.60474: done getting next task for host managed_node3 28023 1726853639.60478: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28023 1726853639.60482: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853639.60503: getting variables 28023 1726853639.60504: in VariableManager get_vars() 28023 1726853639.60549: Calling all_inventory to load vars for managed_node3 28023 1726853639.60552: Calling groups_inventory to load vars for managed_node3 28023 1726853639.60554: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853639.60565: Calling all_plugins_play to load vars for managed_node3 28023 1726853639.60568: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853639.60778: Calling groups_plugins_play to load vars for managed_node3 28023 1726853639.60791: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000647 28023 1726853639.60794: WORKER PROCESS EXITING 28023 1726853639.69130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853639.72529: done with get_vars() 28023 1726853639.72562: done getting variables 28023 1726853639.72612: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:33:59 -0400 (0:00:00.249) 0:00:31.810 ****** 28023 1726853639.72639: entering _queue_task() for managed_node3/fail 28023 1726853639.73611: worker is 1 (out of 1 available) 28023 1726853639.73624: exiting _queue_task() for managed_node3/fail 28023 1726853639.73637: done queuing things up, now waiting for results queue to drain 28023 1726853639.73639: waiting for pending results... 
28023 1726853639.74266: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28023 1726853639.74563: in run() - task 02083763-bbaf-fdb6-dad7-000000000648 28023 1726853639.74632: variable 'ansible_search_path' from source: unknown 28023 1726853639.74644: variable 'ansible_search_path' from source: unknown 28023 1726853639.74691: calling self._execute() 28023 1726853639.74991: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853639.75005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853639.75021: variable 'omit' from source: magic vars 28023 1726853639.76278: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.76283: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853639.76416: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853639.76788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853639.81789: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853639.81945: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853639.82216: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853639.82264: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853639.82324: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853639.82483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28023 1726853639.82549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.82652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.82698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.82790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.82878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.82907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.82936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.82985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.83006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.83049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.83083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.83110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.83150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.83175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.83350: variable 'network_connections' from source: include params 28023 1726853639.83374: variable 'interface0' from source: play vars 28023 1726853639.83460: variable 'interface0' from source: play vars 28023 1726853639.83493: variable 'interface1' from source: play vars 28023 1726853639.83544: variable 'interface1' from source: play vars 28023 1726853639.83711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853639.83809: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853639.83855: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853639.83889: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853639.83924: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853639.83975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853639.84003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853639.84033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.84070: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853639.84124: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853639.84380: variable 'network_connections' from source: include params 28023 1726853639.84389: variable 'interface0' from source: play vars 28023 1726853639.84451: variable 'interface0' from source: play vars 28023 1726853639.84464: variable 'interface1' from source: play vars 28023 1726853639.84527: variable 'interface1' from source: play vars 28023 1726853639.84585: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28023 1726853639.84588: when evaluation is False, skipping this task 28023 1726853639.84590: _execute() done 28023 1726853639.84592: dumping result to json 28023 1726853639.84594: done dumping result, returning 28023 1726853639.84596: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-000000000648] 28023 1726853639.84606: sending task result for task 02083763-bbaf-fdb6-dad7-000000000648 28023 1726853639.84761: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000648 28023 1726853639.84764: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28023 1726853639.84850: no more pending results, returning what we have 28023 1726853639.84853: results queue empty 28023 1726853639.84854: checking for any_errors_fatal 28023 1726853639.84862: done checking for any_errors_fatal 28023 1726853639.84863: checking for max_fail_percentage 28023 1726853639.84865: done checking for max_fail_percentage 28023 1726853639.84865: checking to see if all hosts have failed and the running result is not ok 28023 1726853639.84866: done checking to see if all hosts have failed 28023 1726853639.84867: getting the remaining hosts for this loop 28023 1726853639.84869: done getting the remaining hosts for this loop 28023 1726853639.84876: getting the next task for host managed_node3 28023 1726853639.84886: done getting next task for host managed_node3 28023 1726853639.84891: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28023 1726853639.84896: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853639.84919: getting variables 28023 1726853639.84920: in VariableManager get_vars() 28023 1726853639.84973: Calling all_inventory to load vars for managed_node3 28023 1726853639.84976: Calling groups_inventory to load vars for managed_node3 28023 1726853639.84979: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853639.84993: Calling all_plugins_play to load vars for managed_node3 28023 1726853639.84997: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853639.85000: Calling groups_plugins_play to load vars for managed_node3 28023 1726853639.86536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853639.88130: done with get_vars() 28023 1726853639.88155: done getting variables 28023 1726853639.88216: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:33:59 -0400 (0:00:00.156) 0:00:31.966 ****** 28023 1726853639.88251: entering _queue_task() for managed_node3/package 28023 1726853639.88575: worker is 1 (out of 1 available) 28023 1726853639.88588: 
exiting _queue_task() for managed_node3/package 28023 1726853639.88601: done queuing things up, now waiting for results queue to drain 28023 1726853639.88603: waiting for pending results... 28023 1726853639.88989: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 28023 1726853639.89024: in run() - task 02083763-bbaf-fdb6-dad7-000000000649 28023 1726853639.89044: variable 'ansible_search_path' from source: unknown 28023 1726853639.89052: variable 'ansible_search_path' from source: unknown 28023 1726853639.89100: calling self._execute() 28023 1726853639.89213: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853639.89225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853639.89240: variable 'omit' from source: magic vars 28023 1726853639.89638: variable 'ansible_distribution_major_version' from source: facts 28023 1726853639.89656: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853639.89848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853639.90113: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853639.90156: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853639.90196: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853639.90977: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853639.91104: variable 'network_packages' from source: role '' defaults 28023 1726853639.91434: variable '__network_provider_setup' from source: role '' defaults 28023 1726853639.91475: variable '__network_service_name_default_nm' from source: role '' defaults 28023 1726853639.91774: variable 
'__network_service_name_default_nm' from source: role '' defaults 28023 1726853639.91778: variable '__network_packages_default_nm' from source: role '' defaults 28023 1726853639.91780: variable '__network_packages_default_nm' from source: role '' defaults 28023 1726853639.92154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853639.96276: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853639.96384: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853639.96578: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853639.96614: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853639.96678: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853639.96976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.96980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.96982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.97080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.97104: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.97151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.97212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.97297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.97404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.97424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.97901: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28023 1726853639.98181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.98384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.98389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.98391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.98576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.98691: variable 'ansible_python' from source: facts 28023 1726853639.98724: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28023 1726853639.98810: variable '__network_wpa_supplicant_required' from source: role '' defaults 28023 1726853639.99007: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28023 1726853639.99313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.99340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.99389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.99510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.99528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853639.99776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853639.99787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853639.99791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853639.99794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853639.99980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853640.00235: variable 'network_connections' from source: include params 28023 1726853640.00344: variable 'interface0' from source: play vars 28023 1726853640.00463: variable 'interface0' from source: play vars 28023 1726853640.00481: variable 'interface1' from source: play vars 28023 1726853640.00695: variable 'interface1' from source: play vars 28023 1726853640.00774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853640.00978: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 28023 1726853640.00982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.00985: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853640.01037: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853640.01701: variable 'network_connections' from source: include params 28023 1726853640.01768: variable 'interface0' from source: play vars 28023 1726853640.01990: variable 'interface0' from source: play vars 28023 1726853640.02007: variable 'interface1' from source: play vars 28023 1726853640.02212: variable 'interface1' from source: play vars 28023 1726853640.02250: variable '__network_packages_default_wireless' from source: role '' defaults 28023 1726853640.02521: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853640.03036: variable 'network_connections' from source: include params 28023 1726853640.03378: variable 'interface0' from source: play vars 28023 1726853640.03381: variable 'interface0' from source: play vars 28023 1726853640.03383: variable 'interface1' from source: play vars 28023 1726853640.03422: variable 'interface1' from source: play vars 28023 1726853640.03451: variable '__network_packages_default_team' from source: role '' defaults 28023 1726853640.03558: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853640.03881: variable 'network_connections' from source: include params 28023 1726853640.03891: variable 'interface0' from source: play vars 28023 1726853640.03949: variable 'interface0' from source: play vars 28023 1726853640.03959: variable 'interface1' from source: play vars 28023 
1726853640.04016: variable 'interface1' from source: play vars 28023 1726853640.04073: variable '__network_service_name_default_initscripts' from source: role '' defaults 28023 1726853640.04137: variable '__network_service_name_default_initscripts' from source: role '' defaults 28023 1726853640.04149: variable '__network_packages_default_initscripts' from source: role '' defaults 28023 1726853640.04212: variable '__network_packages_default_initscripts' from source: role '' defaults 28023 1726853640.04432: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28023 1726853640.04896: variable 'network_connections' from source: include params 28023 1726853640.04906: variable 'interface0' from source: play vars 28023 1726853640.04965: variable 'interface0' from source: play vars 28023 1726853640.04980: variable 'interface1' from source: play vars 28023 1726853640.05045: variable 'interface1' from source: play vars 28023 1726853640.05058: variable 'ansible_distribution' from source: facts 28023 1726853640.05066: variable '__network_rh_distros' from source: role '' defaults 28023 1726853640.05082: variable 'ansible_distribution_major_version' from source: facts 28023 1726853640.05099: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28023 1726853640.05265: variable 'ansible_distribution' from source: facts 28023 1726853640.05276: variable '__network_rh_distros' from source: role '' defaults 28023 1726853640.05286: variable 'ansible_distribution_major_version' from source: facts 28023 1726853640.05301: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28023 1726853640.05468: variable 'ansible_distribution' from source: facts 28023 1726853640.05479: variable '__network_rh_distros' from source: role '' defaults 28023 1726853640.05488: variable 'ansible_distribution_major_version' from source: facts 28023 1726853640.05523: variable 'network_provider' from 
source: set_fact 28023 1726853640.05546: variable 'ansible_facts' from source: unknown 28023 1726853640.06206: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28023 1726853640.06214: when evaluation is False, skipping this task 28023 1726853640.06221: _execute() done 28023 1726853640.06227: dumping result to json 28023 1726853640.06235: done dumping result, returning 28023 1726853640.06247: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-fdb6-dad7-000000000649] 28023 1726853640.06255: sending task result for task 02083763-bbaf-fdb6-dad7-000000000649 28023 1726853640.06476: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000649 28023 1726853640.06480: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28023 1726853640.06530: no more pending results, returning what we have 28023 1726853640.06533: results queue empty 28023 1726853640.06534: checking for any_errors_fatal 28023 1726853640.06540: done checking for any_errors_fatal 28023 1726853640.06541: checking for max_fail_percentage 28023 1726853640.06543: done checking for max_fail_percentage 28023 1726853640.06543: checking to see if all hosts have failed and the running result is not ok 28023 1726853640.06545: done checking to see if all hosts have failed 28023 1726853640.06545: getting the remaining hosts for this loop 28023 1726853640.06547: done getting the remaining hosts for this loop 28023 1726853640.06551: getting the next task for host managed_node3 28023 1726853640.06559: done getting next task for host managed_node3 28023 1726853640.06564: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28023 1726853640.06568: ^ state is: HOST STATE: block=3, task=11, 
rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853640.06595: getting variables 28023 1726853640.06596: in VariableManager get_vars() 28023 1726853640.06641: Calling all_inventory to load vars for managed_node3 28023 1726853640.06644: Calling groups_inventory to load vars for managed_node3 28023 1726853640.06647: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853640.06659: Calling all_plugins_play to load vars for managed_node3 28023 1726853640.06662: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853640.06665: Calling groups_plugins_play to load vars for managed_node3 28023 1726853640.08286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853640.09838: done with get_vars() 28023 1726853640.09865: done getting variables 28023 1726853640.09928: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:34:00 -0400 (0:00:00.217) 0:00:32.183 ****** 28023 1726853640.09961: entering _queue_task() for managed_node3/package 28023 1726853640.10300: worker is 1 (out of 1 available) 28023 1726853640.10311: exiting _queue_task() for managed_node3/package 28023 1726853640.10323: done queuing things up, now waiting for results queue to drain 28023 1726853640.10324: waiting for pending results... 28023 1726853640.10615: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28023 1726853640.10959: in run() - task 02083763-bbaf-fdb6-dad7-00000000064a 28023 1726853640.11068: variable 'ansible_search_path' from source: unknown 28023 1726853640.11074: variable 'ansible_search_path' from source: unknown 28023 1726853640.11077: calling self._execute() 28023 1726853640.11335: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853640.11444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853640.11448: variable 'omit' from source: magic vars 28023 1726853640.12191: variable 'ansible_distribution_major_version' from source: facts 28023 1726853640.12213: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853640.12376: variable 'network_state' from source: role '' defaults 28023 1726853640.12639: Evaluated conditional (network_state != {}): False 28023 1726853640.12643: when evaluation is False, skipping this task 28023 1726853640.12645: _execute() done 28023 1726853640.12648: dumping result to json 28023 1726853640.12650: done dumping result, returning 28023 1726853640.12653: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-fdb6-dad7-00000000064a] 28023 1726853640.12656: sending task result for task 02083763-bbaf-fdb6-dad7-00000000064a 28023 1726853640.12729: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000064a 28023 1726853640.12733: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853640.12789: no more pending results, returning what we have 28023 1726853640.12794: results queue empty 28023 1726853640.12795: checking for any_errors_fatal 28023 1726853640.12803: done checking for any_errors_fatal 28023 1726853640.12803: checking for max_fail_percentage 28023 1726853640.12805: done checking for max_fail_percentage 28023 1726853640.12806: checking to see if all hosts have failed and the running result is not ok 28023 1726853640.12807: done checking to see if all hosts have failed 28023 1726853640.12808: getting the remaining hosts for this loop 28023 1726853640.12809: done getting the remaining hosts for this loop 28023 1726853640.12814: getting the next task for host managed_node3 28023 1726853640.12824: done getting next task for host managed_node3 28023 1726853640.12828: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28023 1726853640.12833: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853640.12856: getting variables 28023 1726853640.12858: in VariableManager get_vars() 28023 1726853640.12904: Calling all_inventory to load vars for managed_node3 28023 1726853640.12907: Calling groups_inventory to load vars for managed_node3 28023 1726853640.12910: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853640.12924: Calling all_plugins_play to load vars for managed_node3 28023 1726853640.12928: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853640.12931: Calling groups_plugins_play to load vars for managed_node3 28023 1726853640.15926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853640.19176: done with get_vars() 28023 1726853640.19201: done getting variables 28023 1726853640.19259: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:34:00 -0400 (0:00:00.097) 0:00:32.280 ****** 28023 1726853640.19699: entering _queue_task() for managed_node3/package 28023 1726853640.20158: worker is 1 (out of 1 available) 
28023 1726853640.20170: exiting _queue_task() for managed_node3/package 28023 1726853640.20286: done queuing things up, now waiting for results queue to drain 28023 1726853640.20287: waiting for pending results... 28023 1726853640.20470: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28023 1726853640.20617: in run() - task 02083763-bbaf-fdb6-dad7-00000000064b 28023 1726853640.20641: variable 'ansible_search_path' from source: unknown 28023 1726853640.20649: variable 'ansible_search_path' from source: unknown 28023 1726853640.20689: calling self._execute() 28023 1726853640.20801: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853640.20813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853640.20826: variable 'omit' from source: magic vars 28023 1726853640.21376: variable 'ansible_distribution_major_version' from source: facts 28023 1726853640.21380: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853640.21382: variable 'network_state' from source: role '' defaults 28023 1726853640.21385: Evaluated conditional (network_state != {}): False 28023 1726853640.21387: when evaluation is False, skipping this task 28023 1726853640.21389: _execute() done 28023 1726853640.21391: dumping result to json 28023 1726853640.21393: done dumping result, returning 28023 1726853640.21396: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-fdb6-dad7-00000000064b] 28023 1726853640.21398: sending task result for task 02083763-bbaf-fdb6-dad7-00000000064b 28023 1726853640.21493: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000064b 28023 1726853640.21501: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", 
"skip_reason": "Conditional result was False" } 28023 1726853640.21553: no more pending results, returning what we have 28023 1726853640.21556: results queue empty 28023 1726853640.21557: checking for any_errors_fatal 28023 1726853640.21565: done checking for any_errors_fatal 28023 1726853640.21566: checking for max_fail_percentage 28023 1726853640.21568: done checking for max_fail_percentage 28023 1726853640.21569: checking to see if all hosts have failed and the running result is not ok 28023 1726853640.21570: done checking to see if all hosts have failed 28023 1726853640.21572: getting the remaining hosts for this loop 28023 1726853640.21574: done getting the remaining hosts for this loop 28023 1726853640.21578: getting the next task for host managed_node3 28023 1726853640.21586: done getting next task for host managed_node3 28023 1726853640.21590: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28023 1726853640.21595: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853640.21616: getting variables 28023 1726853640.21617: in VariableManager get_vars() 28023 1726853640.21660: Calling all_inventory to load vars for managed_node3 28023 1726853640.21663: Calling groups_inventory to load vars for managed_node3 28023 1726853640.21666: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853640.21880: Calling all_plugins_play to load vars for managed_node3 28023 1726853640.21884: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853640.21888: Calling groups_plugins_play to load vars for managed_node3 28023 1726853640.23226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853640.24816: done with get_vars() 28023 1726853640.24839: done getting variables 28023 1726853640.24900: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:34:00 -0400 (0:00:00.052) 0:00:32.333 ****** 28023 1726853640.24935: entering _queue_task() for managed_node3/service 28023 1726853640.25252: worker is 1 (out of 1 available) 28023 1726853640.25264: exiting _queue_task() for managed_node3/service 28023 1726853640.25478: done queuing things up, now waiting for results queue to drain 28023 1726853640.25480: waiting for pending results... 
28023 1726853640.25560: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28023 1726853640.25814: in run() - task 02083763-bbaf-fdb6-dad7-00000000064c 28023 1726853640.25818: variable 'ansible_search_path' from source: unknown 28023 1726853640.25821: variable 'ansible_search_path' from source: unknown 28023 1726853640.25823: calling self._execute() 28023 1726853640.25874: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853640.25886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853640.25898: variable 'omit' from source: magic vars 28023 1726853640.26272: variable 'ansible_distribution_major_version' from source: facts 28023 1726853640.26291: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853640.26413: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853640.26608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853640.30780: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853640.31541: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853640.31588: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853640.31865: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853640.31869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853640.31898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28023 1726853640.31934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853640.31967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.32016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853640.32031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853640.32080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853640.32107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853640.32131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.32169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853640.32192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853640.32231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853640.32256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853640.32284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.32327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853640.32376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853640.32513: variable 'network_connections' from source: include params 28023 1726853640.32530: variable 'interface0' from source: play vars 28023 1726853640.32608: variable 'interface0' from source: play vars 28023 1726853640.32627: variable 'interface1' from source: play vars 28023 1726853640.32688: variable 'interface1' from source: play vars 28023 1726853640.32764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853640.32944: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853640.32980: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853640.33053: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853640.33063: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853640.33114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853640.33150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853640.33186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.33225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853640.33299: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853640.33595: variable 'network_connections' from source: include params 28023 1726853640.33677: variable 'interface0' from source: play vars 28023 1726853640.33681: variable 'interface0' from source: play vars 28023 1726853640.33683: variable 'interface1' from source: play vars 28023 1726853640.33758: variable 'interface1' from source: play vars 28023 1726853640.33794: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28023 1726853640.33803: when evaluation is False, skipping this task 28023 1726853640.33814: _execute() done 28023 1726853640.33827: dumping result to json 28023 1726853640.33835: done dumping result, returning 28023 1726853640.33848: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-fdb6-dad7-00000000064c] 28023 1726853640.33866: sending task result for task 02083763-bbaf-fdb6-dad7-00000000064c 28023 1726853640.34279: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000064c 28023 1726853640.34283: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28023 1726853640.34327: no more pending results, returning what we have 28023 1726853640.34330: results queue empty 28023 1726853640.34331: checking for any_errors_fatal 28023 1726853640.34336: done checking for any_errors_fatal 28023 1726853640.34337: checking for max_fail_percentage 28023 1726853640.34339: done checking for max_fail_percentage 28023 1726853640.34340: checking to see if all hosts have failed and the running result is not ok 28023 1726853640.34341: done checking to see if all hosts have failed 28023 1726853640.34341: getting the remaining hosts for this loop 28023 1726853640.34343: done getting the remaining hosts for this loop 28023 1726853640.34346: getting the next task for host managed_node3 28023 1726853640.34353: done getting next task for host managed_node3 28023 1726853640.34357: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28023 1726853640.34361: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853640.34382: getting variables 28023 1726853640.34385: in VariableManager get_vars() 28023 1726853640.34427: Calling all_inventory to load vars for managed_node3 28023 1726853640.34430: Calling groups_inventory to load vars for managed_node3 28023 1726853640.34433: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853640.34443: Calling all_plugins_play to load vars for managed_node3 28023 1726853640.34447: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853640.34450: Calling groups_plugins_play to load vars for managed_node3 28023 1726853640.36219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853640.38558: done with get_vars() 28023 1726853640.38586: done getting variables 28023 1726853640.38649: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:34:00 -0400 (0:00:00.137) 0:00:32.470 ****** 28023 1726853640.38683: entering _queue_task() for managed_node3/service 28023 1726853640.39023: worker is 1 (out of 1 available) 28023 1726853640.39036: 
exiting _queue_task() for managed_node3/service 28023 1726853640.39047: done queuing things up, now waiting for results queue to drain 28023 1726853640.39048: waiting for pending results... 28023 1726853640.39347: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28023 1726853640.39678: in run() - task 02083763-bbaf-fdb6-dad7-00000000064d 28023 1726853640.39682: variable 'ansible_search_path' from source: unknown 28023 1726853640.39685: variable 'ansible_search_path' from source: unknown 28023 1726853640.39687: calling self._execute() 28023 1726853640.39690: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853640.39692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853640.39694: variable 'omit' from source: magic vars 28023 1726853640.40145: variable 'ansible_distribution_major_version' from source: facts 28023 1726853640.40166: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853640.40332: variable 'network_provider' from source: set_fact 28023 1726853640.40336: variable 'network_state' from source: role '' defaults 28023 1726853640.40347: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28023 1726853640.40352: variable 'omit' from source: magic vars 28023 1726853640.40415: variable 'omit' from source: magic vars 28023 1726853640.40441: variable 'network_service_name' from source: role '' defaults 28023 1726853640.40517: variable 'network_service_name' from source: role '' defaults 28023 1726853640.40627: variable '__network_provider_setup' from source: role '' defaults 28023 1726853640.40633: variable '__network_service_name_default_nm' from source: role '' defaults 28023 1726853640.40694: variable '__network_service_name_default_nm' from source: role '' defaults 28023 1726853640.40711: variable '__network_packages_default_nm' from source: role '' defaults 28023 
1726853640.40769: variable '__network_packages_default_nm' from source: role '' defaults 28023 1726853640.40970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853640.43565: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853640.43639: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853640.43689: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853640.43720: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853640.43746: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853640.43976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853640.43980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853640.43983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.43986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853640.43988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 28023 1726853640.43999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853640.44026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853640.44050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.44094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853640.44114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853640.44367: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28023 1726853640.44585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853640.44589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853640.44610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.44656: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853640.44670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853640.44819: variable 'ansible_python' from source: facts 28023 1726853640.44849: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28023 1726853640.44963: variable '__network_wpa_supplicant_required' from source: role '' defaults 28023 1726853640.45041: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28023 1726853640.45183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853640.45204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853640.45229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.45265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853640.45292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853640.45336: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853640.45378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853640.45381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.45428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853640.45443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853640.45594: variable 'network_connections' from source: include params 28023 1726853640.45598: variable 'interface0' from source: play vars 28023 1726853640.45703: variable 'interface0' from source: play vars 28023 1726853640.45706: variable 'interface1' from source: play vars 28023 1726853640.45751: variable 'interface1' from source: play vars 28023 1726853640.45862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853640.46066: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853640.46117: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853640.46247: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853640.46251: Loading TestModule 'uri' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853640.46274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853640.46303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853640.46334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853640.46375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853640.46421: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853640.46845: variable 'network_connections' from source: include params 28023 1726853640.46901: variable 'interface0' from source: play vars 28023 1726853640.47010: variable 'interface0' from source: play vars 28023 1726853640.47013: variable 'interface1' from source: play vars 28023 1726853640.47022: variable 'interface1' from source: play vars 28023 1726853640.47059: variable '__network_packages_default_wireless' from source: role '' defaults 28023 1726853640.47143: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853640.47421: variable 'network_connections' from source: include params 28023 1726853640.47434: variable 'interface0' from source: play vars 28023 1726853640.47503: variable 'interface0' from source: play vars 28023 1726853640.47514: variable 'interface1' from source: play vars 28023 1726853640.47591: variable 'interface1' from source: play vars 28023 1726853640.47617: variable 
'__network_packages_default_team' from source: role '' defaults 28023 1726853640.47700: variable '__network_team_connections_defined' from source: role '' defaults 28023 1726853640.48084: variable 'network_connections' from source: include params 28023 1726853640.48095: variable 'interface0' from source: play vars 28023 1726853640.48194: variable 'interface0' from source: play vars 28023 1726853640.48207: variable 'interface1' from source: play vars 28023 1726853640.48319: variable 'interface1' from source: play vars 28023 1726853640.48378: variable '__network_service_name_default_initscripts' from source: role '' defaults 28023 1726853640.48444: variable '__network_service_name_default_initscripts' from source: role '' defaults 28023 1726853640.48476: variable '__network_packages_default_initscripts' from source: role '' defaults 28023 1726853640.48525: variable '__network_packages_default_initscripts' from source: role '' defaults 28023 1726853640.48752: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28023 1726853640.49289: variable 'network_connections' from source: include params 28023 1726853640.49477: variable 'interface0' from source: play vars 28023 1726853640.49480: variable 'interface0' from source: play vars 28023 1726853640.49483: variable 'interface1' from source: play vars 28023 1726853640.49485: variable 'interface1' from source: play vars 28023 1726853640.49487: variable 'ansible_distribution' from source: facts 28023 1726853640.49489: variable '__network_rh_distros' from source: role '' defaults 28023 1726853640.49491: variable 'ansible_distribution_major_version' from source: facts 28023 1726853640.49493: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28023 1726853640.49651: variable 'ansible_distribution' from source: facts 28023 1726853640.49660: variable '__network_rh_distros' from source: role '' defaults 28023 1726853640.49670: variable 
'ansible_distribution_major_version' from source: facts 28023 1726853640.49689: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28023 1726853640.49864: variable 'ansible_distribution' from source: facts 28023 1726853640.49878: variable '__network_rh_distros' from source: role '' defaults 28023 1726853640.49890: variable 'ansible_distribution_major_version' from source: facts 28023 1726853640.49934: variable 'network_provider' from source: set_fact 28023 1726853640.49966: variable 'omit' from source: magic vars 28023 1726853640.50003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853640.50034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853640.50068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853640.50095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853640.50155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853640.50159: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853640.50161: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853640.50163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853640.50268: Set connection var ansible_shell_type to sh 28023 1726853640.50284: Set connection var ansible_shell_executable to /bin/sh 28023 1726853640.50295: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853640.50305: Set connection var ansible_connection to ssh 28023 1726853640.50316: Set connection var ansible_pipelining to False 28023 1726853640.50327: Set connection var ansible_timeout to 10 28023 1726853640.50375: 
variable 'ansible_shell_executable' from source: unknown 28023 1726853640.50379: variable 'ansible_connection' from source: unknown 28023 1726853640.50381: variable 'ansible_module_compression' from source: unknown 28023 1726853640.50479: variable 'ansible_shell_type' from source: unknown 28023 1726853640.50482: variable 'ansible_shell_executable' from source: unknown 28023 1726853640.50489: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853640.50492: variable 'ansible_pipelining' from source: unknown 28023 1726853640.50494: variable 'ansible_timeout' from source: unknown 28023 1726853640.50496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853640.50542: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853640.50559: variable 'omit' from source: magic vars 28023 1726853640.50570: starting attempt loop 28023 1726853640.50588: running the handler 28023 1726853640.50666: variable 'ansible_facts' from source: unknown 28023 1726853640.51494: _low_level_execute_command(): starting 28023 1726853640.51507: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853640.52294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853640.52360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853640.52384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853640.52408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853640.52528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853640.54523: stdout chunk (state=3): >>>/root <<< 28023 1726853640.54527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853640.54529: stdout chunk (state=3): >>><<< 28023 1726853640.54531: stderr chunk (state=3): >>><<< 28023 1726853640.54534: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853640.54577: _low_level_execute_command(): starting 28023 1726853640.54581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011 `" && echo ansible-tmp-1726853640.545104-29482-256020737080011="` echo /root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011 `" ) && sleep 0' 28023 1726853640.55222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853640.55239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853640.55258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853640.55381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853640.55386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853640.55401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853640.55418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853640.55520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853640.57539: stdout chunk (state=3): >>>ansible-tmp-1726853640.545104-29482-256020737080011=/root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011 <<< 28023 1726853640.57690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853640.57700: stdout chunk (state=3): >>><<< 28023 1726853640.57726: stderr chunk (state=3): >>><<< 28023 1726853640.57743: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853640.545104-29482-256020737080011=/root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853640.57876: variable 'ansible_module_compression' from source: unknown 28023 1726853640.57879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28023 1726853640.57917: variable 'ansible_facts' from source: unknown 28023 1726853640.58349: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011/AnsiballZ_systemd.py 28023 1726853640.58513: Sending initial data 28023 1726853640.58522: Sent initial data (155 bytes) 28023 1726853640.59166: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853640.59183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853640.59196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853640.59213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853640.59229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853640.59241: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853640.59289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853640.59303: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853640.59383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853640.59400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853640.59421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853640.59505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853640.61211: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853640.61268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853640.61354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpopwwx9j3 /root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011/AnsiballZ_systemd.py <<< 28023 1726853640.61357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011/AnsiballZ_systemd.py" <<< 28023 1726853640.61405: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpopwwx9j3" to remote "/root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011/AnsiballZ_systemd.py" <<< 28023 1726853640.63430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853640.63453: stderr chunk (state=3): >>><<< 28023 1726853640.63477: stdout chunk (state=3): >>><<< 28023 1726853640.63489: done transferring module to remote 28023 1726853640.63503: _low_level_execute_command(): starting 28023 1726853640.63518: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011/ /root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011/AnsiballZ_systemd.py && sleep 0' 28023 1726853640.64144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853640.64157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853640.64173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853640.64248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853640.64292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853640.64314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853640.64327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853640.64420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853640.66334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853640.66345: stdout chunk (state=3): >>><<< 28023 1726853640.66361: stderr chunk (state=3): >>><<< 28023 1726853640.66460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853640.66463: _low_level_execute_command(): starting 28023 1726853640.66466: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011/AnsiballZ_systemd.py && sleep 0' 28023 1726853640.67077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853640.67091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853640.67109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853640.67210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853640.67214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' 
debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853640.67244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853640.67342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853640.97150: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10719232", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3315904512", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1937950000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", 
"StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override 
cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": 
"9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": 
"24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28023 1726853640.99327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853640.99331: stdout chunk (state=3): >>><<< 28023 1726853640.99333: stderr chunk (state=3): >>><<< 28023 1726853640.99337: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 
13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10719232", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3315904512", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1937950000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not 
set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", 
"InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853640.99655: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853640.99659: _low_level_execute_command(): starting 28023 1726853640.99662: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853640.545104-29482-256020737080011/ > /dev/null 2>&1 && sleep 0' 28023 1726853641.00185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853641.00194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853641.00205: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 28023 1726853641.00224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853641.00330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853641.00357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853641.00449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853641.02351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853641.02401: stderr chunk (state=3): >>><<< 28023 1726853641.02412: stdout chunk (state=3): >>><<< 28023 1726853641.02444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853641.02576: handler run complete 28023 1726853641.02579: attempt loop complete, returning result 28023 1726853641.02582: _execute() done 28023 1726853641.02584: dumping result to json 28023 1726853641.02586: done dumping result, returning 28023 1726853641.02588: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-fdb6-dad7-00000000064d] 28023 1726853641.02590: sending task result for task 02083763-bbaf-fdb6-dad7-00000000064d 28023 1726853641.03151: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000064d 28023 1726853641.03154: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853641.03213: no more pending results, returning what we have 28023 1726853641.03217: results queue empty 28023 1726853641.03217: checking for any_errors_fatal 28023 1726853641.03223: done checking for any_errors_fatal 28023 1726853641.03223: checking for max_fail_percentage 28023 1726853641.03225: done checking for max_fail_percentage 28023 1726853641.03226: checking to see if all hosts have failed and the running result is not ok 28023 1726853641.03227: 
done checking to see if all hosts have failed 28023 1726853641.03228: getting the remaining hosts for this loop 28023 1726853641.03229: done getting the remaining hosts for this loop 28023 1726853641.03233: getting the next task for host managed_node3 28023 1726853641.03241: done getting next task for host managed_node3 28023 1726853641.03245: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28023 1726853641.03249: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853641.03261: getting variables 28023 1726853641.03377: in VariableManager get_vars() 28023 1726853641.03419: Calling all_inventory to load vars for managed_node3 28023 1726853641.03422: Calling groups_inventory to load vars for managed_node3 28023 1726853641.03424: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853641.03437: Calling all_plugins_play to load vars for managed_node3 28023 1726853641.03440: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853641.03443: Calling groups_plugins_play to load vars for managed_node3 28023 1726853641.05683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853641.07422: done with get_vars() 28023 1726853641.07448: done getting variables 28023 1726853641.07516: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:34:01 -0400 (0:00:00.688) 0:00:33.159 ****** 28023 1726853641.07553: entering _queue_task() for managed_node3/service 28023 1726853641.07934: worker is 1 (out of 1 available) 28023 1726853641.07946: exiting _queue_task() for managed_node3/service 28023 1726853641.07960: done queuing things up, now waiting for results queue to drain 28023 1726853641.07961: waiting for pending results... 
28023 1726853641.08394: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28023 1726853641.08401: in run() - task 02083763-bbaf-fdb6-dad7-00000000064e 28023 1726853641.08413: variable 'ansible_search_path' from source: unknown 28023 1726853641.08421: variable 'ansible_search_path' from source: unknown 28023 1726853641.08490: calling self._execute() 28023 1726853641.08579: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853641.08594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853641.08612: variable 'omit' from source: magic vars 28023 1726853641.09037: variable 'ansible_distribution_major_version' from source: facts 28023 1726853641.09041: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853641.09153: variable 'network_provider' from source: set_fact 28023 1726853641.09168: Evaluated conditional (network_provider == "nm"): True 28023 1726853641.09267: variable '__network_wpa_supplicant_required' from source: role '' defaults 28023 1726853641.09365: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28023 1726853641.09579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853641.11730: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853641.11810: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853641.11853: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853641.11976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853641.11979: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853641.12024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853641.12060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853641.12097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853641.12140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853641.12161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853641.12216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853641.12243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853641.12275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853641.12321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853641.12339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853641.12418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853641.12422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853641.12442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853641.12490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853641.12508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853641.12667: variable 'network_connections' from source: include params 28023 1726853641.12745: variable 'interface0' from source: play vars 28023 1726853641.12777: variable 'interface0' from source: play vars 28023 1726853641.12796: variable 'interface1' from source: play vars 28023 1726853641.12873: variable 'interface1' from source: play vars 28023 1726853641.12948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28023 1726853641.13127: Loading TestModule 'core' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28023 1726853641.13170: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28023 1726853641.13210: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28023 1726853641.13243: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28023 1726853641.13297: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28023 1726853641.13475: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28023 1726853641.13478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853641.13480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28023 1726853641.13483: variable '__network_wireless_connections_defined' from source: role '' defaults 28023 1726853641.13674: variable 'network_connections' from source: include params 28023 1726853641.13686: variable 'interface0' from source: play vars 28023 1726853641.13755: variable 'interface0' from source: play vars 28023 1726853641.13774: variable 'interface1' from source: play vars 28023 1726853641.13844: variable 'interface1' from source: play vars 28023 1726853641.13884: Evaluated conditional (__network_wpa_supplicant_required): False 28023 1726853641.13892: when evaluation is False, skipping this task 28023 1726853641.13908: _execute() done 28023 
1726853641.13916: dumping result to json 28023 1726853641.13930: done dumping result, returning 28023 1726853641.13943: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-fdb6-dad7-00000000064e] 28023 1726853641.13952: sending task result for task 02083763-bbaf-fdb6-dad7-00000000064e 28023 1726853641.14209: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000064e 28023 1726853641.14213: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28023 1726853641.14264: no more pending results, returning what we have 28023 1726853641.14268: results queue empty 28023 1726853641.14269: checking for any_errors_fatal 28023 1726853641.14299: done checking for any_errors_fatal 28023 1726853641.14301: checking for max_fail_percentage 28023 1726853641.14303: done checking for max_fail_percentage 28023 1726853641.14303: checking to see if all hosts have failed and the running result is not ok 28023 1726853641.14304: done checking to see if all hosts have failed 28023 1726853641.14305: getting the remaining hosts for this loop 28023 1726853641.14307: done getting the remaining hosts for this loop 28023 1726853641.14311: getting the next task for host managed_node3 28023 1726853641.14320: done getting next task for host managed_node3 28023 1726853641.14324: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28023 1726853641.14329: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853641.14351: getting variables 28023 1726853641.14353: in VariableManager get_vars() 28023 1726853641.14603: Calling all_inventory to load vars for managed_node3 28023 1726853641.14606: Calling groups_inventory to load vars for managed_node3 28023 1726853641.14608: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853641.14620: Calling all_plugins_play to load vars for managed_node3 28023 1726853641.14623: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853641.14626: Calling groups_plugins_play to load vars for managed_node3 28023 1726853641.16009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853641.17618: done with get_vars() 28023 1726853641.17651: done getting variables 28023 1726853641.17714: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:34:01 -0400 
(0:00:00.101) 0:00:33.261 ****** 28023 1726853641.17750: entering _queue_task() for managed_node3/service 28023 1726853641.18119: worker is 1 (out of 1 available) 28023 1726853641.18131: exiting _queue_task() for managed_node3/service 28023 1726853641.18145: done queuing things up, now waiting for results queue to drain 28023 1726853641.18146: waiting for pending results... 28023 1726853641.18425: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 28023 1726853641.18521: in run() - task 02083763-bbaf-fdb6-dad7-00000000064f 28023 1726853641.18535: variable 'ansible_search_path' from source: unknown 28023 1726853641.18539: variable 'ansible_search_path' from source: unknown 28023 1726853641.18570: calling self._execute() 28023 1726853641.18662: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853641.18666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853641.18675: variable 'omit' from source: magic vars 28023 1726853641.18964: variable 'ansible_distribution_major_version' from source: facts 28023 1726853641.18972: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853641.19051: variable 'network_provider' from source: set_fact 28023 1726853641.19054: Evaluated conditional (network_provider == "initscripts"): False 28023 1726853641.19061: when evaluation is False, skipping this task 28023 1726853641.19064: _execute() done 28023 1726853641.19067: dumping result to json 28023 1726853641.19069: done dumping result, returning 28023 1726853641.19080: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-fdb6-dad7-00000000064f] 28023 1726853641.19083: sending task result for task 02083763-bbaf-fdb6-dad7-00000000064f 28023 1726853641.19172: done sending task result for task 02083763-bbaf-fdb6-dad7-00000000064f 28023 1726853641.19176: WORKER PROCESS 
EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28023 1726853641.19218: no more pending results, returning what we have 28023 1726853641.19221: results queue empty 28023 1726853641.19222: checking for any_errors_fatal 28023 1726853641.19230: done checking for any_errors_fatal 28023 1726853641.19231: checking for max_fail_percentage 28023 1726853641.19233: done checking for max_fail_percentage 28023 1726853641.19234: checking to see if all hosts have failed and the running result is not ok 28023 1726853641.19235: done checking to see if all hosts have failed 28023 1726853641.19235: getting the remaining hosts for this loop 28023 1726853641.19237: done getting the remaining hosts for this loop 28023 1726853641.19240: getting the next task for host managed_node3 28023 1726853641.19247: done getting next task for host managed_node3 28023 1726853641.19251: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28023 1726853641.19255: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853641.19282: getting variables 28023 1726853641.19283: in VariableManager get_vars() 28023 1726853641.19327: Calling all_inventory to load vars for managed_node3 28023 1726853641.19329: Calling groups_inventory to load vars for managed_node3 28023 1726853641.19331: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853641.19342: Calling all_plugins_play to load vars for managed_node3 28023 1726853641.19344: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853641.19347: Calling groups_plugins_play to load vars for managed_node3 28023 1726853641.20576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853641.21670: done with get_vars() 28023 1726853641.21697: done getting variables 28023 1726853641.21741: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:34:01 -0400 (0:00:00.040) 0:00:33.301 ****** 28023 1726853641.21772: entering _queue_task() for managed_node3/copy 28023 1726853641.22041: worker is 1 (out of 1 available) 28023 1726853641.22053: exiting _queue_task() for managed_node3/copy 28023 1726853641.22069: done queuing things up, now waiting for results queue to drain 28023 1726853641.22072: waiting for pending results... 
28023 1726853641.22250: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28023 1726853641.22338: in run() - task 02083763-bbaf-fdb6-dad7-000000000650 28023 1726853641.22350: variable 'ansible_search_path' from source: unknown 28023 1726853641.22354: variable 'ansible_search_path' from source: unknown 28023 1726853641.22387: calling self._execute() 28023 1726853641.22469: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853641.22475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853641.22485: variable 'omit' from source: magic vars 28023 1726853641.22764: variable 'ansible_distribution_major_version' from source: facts 28023 1726853641.22775: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853641.22850: variable 'network_provider' from source: set_fact 28023 1726853641.22854: Evaluated conditional (network_provider == "initscripts"): False 28023 1726853641.22860: when evaluation is False, skipping this task 28023 1726853641.22863: _execute() done 28023 1726853641.22866: dumping result to json 28023 1726853641.22868: done dumping result, returning 28023 1726853641.22878: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-fdb6-dad7-000000000650] 28023 1726853641.22882: sending task result for task 02083763-bbaf-fdb6-dad7-000000000650 28023 1726853641.22978: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000650 28023 1726853641.22981: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28023 1726853641.23026: no more pending results, returning what we have 28023 1726853641.23030: results queue empty 28023 1726853641.23031: checking for 
any_errors_fatal 28023 1726853641.23042: done checking for any_errors_fatal 28023 1726853641.23043: checking for max_fail_percentage 28023 1726853641.23048: done checking for max_fail_percentage 28023 1726853641.23049: checking to see if all hosts have failed and the running result is not ok 28023 1726853641.23050: done checking to see if all hosts have failed 28023 1726853641.23050: getting the remaining hosts for this loop 28023 1726853641.23052: done getting the remaining hosts for this loop 28023 1726853641.23056: getting the next task for host managed_node3 28023 1726853641.23066: done getting next task for host managed_node3 28023 1726853641.23069: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28023 1726853641.23076: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853641.23099: getting variables 28023 1726853641.23101: in VariableManager get_vars() 28023 1726853641.23140: Calling all_inventory to load vars for managed_node3 28023 1726853641.23142: Calling groups_inventory to load vars for managed_node3 28023 1726853641.23144: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853641.23154: Calling all_plugins_play to load vars for managed_node3 28023 1726853641.23156: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853641.23161: Calling groups_plugins_play to load vars for managed_node3 28023 1726853641.24643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853641.26425: done with get_vars() 28023 1726853641.26465: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:34:01 -0400 (0:00:00.047) 0:00:33.349 ****** 28023 1726853641.26559: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 28023 1726853641.26953: worker is 1 (out of 1 available) 28023 1726853641.26968: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 28023 1726853641.26988: done queuing things up, now waiting for results queue to drain 28023 1726853641.26990: waiting for pending results... 
28023 1726853641.27327: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28023 1726853641.27362: in run() - task 02083763-bbaf-fdb6-dad7-000000000651 28023 1726853641.27418: variable 'ansible_search_path' from source: unknown 28023 1726853641.27422: variable 'ansible_search_path' from source: unknown 28023 1726853641.27429: calling self._execute() 28023 1726853641.27519: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853641.27523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853641.27552: variable 'omit' from source: magic vars 28023 1726853641.27899: variable 'ansible_distribution_major_version' from source: facts 28023 1726853641.28076: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853641.28080: variable 'omit' from source: magic vars 28023 1726853641.28082: variable 'omit' from source: magic vars 28023 1726853641.28129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28023 1726853641.30529: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28023 1726853641.30592: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28023 1726853641.30627: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28023 1726853641.30663: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28023 1726853641.30689: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28023 1726853641.30766: variable 'network_provider' from source: set_fact 28023 1726853641.30936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28023 1726853641.30941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28023 1726853641.30948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28023 1726853641.30990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28023 1726853641.31004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28023 1726853641.31154: variable 'omit' from source: magic vars 28023 1726853641.31186: variable 'omit' from source: magic vars 28023 1726853641.31285: variable 'network_connections' from source: include params 28023 1726853641.31296: variable 'interface0' from source: play vars 28023 1726853641.31364: variable 'interface0' from source: play vars 28023 1726853641.31378: variable 'interface1' from source: play vars 28023 1726853641.31429: variable 'interface1' from source: play vars 28023 1726853641.31567: variable 'omit' from source: magic vars 28023 1726853641.31576: variable '__lsr_ansible_managed' from source: task vars 28023 1726853641.31629: variable '__lsr_ansible_managed' from source: task vars 28023 1726853641.31897: Loaded config def from plugin (lookup/template) 28023 1726853641.31900: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28023 1726853641.31927: File lookup term: 
get_ansible_managed.j2 28023 1726853641.31931: variable 'ansible_search_path' from source: unknown 28023 1726853641.32076: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28023 1726853641.32081: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28023 1726853641.32084: variable 'ansible_search_path' from source: unknown 28023 1726853641.37905: variable 'ansible_managed' from source: unknown 28023 1726853641.38318: variable 'omit' from source: magic vars 28023 1726853641.38345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853641.38375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853641.38503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853641.38506: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853641.38606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853641.38639: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853641.38643: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853641.38645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853641.38887: Set connection var ansible_shell_type to sh 28023 1726853641.38891: Set connection var ansible_shell_executable to /bin/sh 28023 1726853641.38893: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853641.38895: Set connection var ansible_connection to ssh 28023 1726853641.38897: Set connection var ansible_pipelining to False 28023 1726853641.38899: Set connection var ansible_timeout to 10 28023 1726853641.38900: variable 'ansible_shell_executable' from source: unknown 28023 1726853641.38902: variable 'ansible_connection' from source: unknown 28023 1726853641.38904: variable 'ansible_module_compression' from source: unknown 28023 1726853641.38906: variable 'ansible_shell_type' from source: unknown 28023 1726853641.38908: variable 'ansible_shell_executable' from source: unknown 28023 1726853641.38910: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853641.38912: variable 'ansible_pipelining' from source: unknown 28023 1726853641.38913: variable 'ansible_timeout' from source: unknown 28023 1726853641.38915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853641.39130: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853641.39242: variable 'omit' from source: magic vars 28023 1726853641.39254: starting attempt loop 28023 1726853641.39275: running the handler 28023 1726853641.39289: _low_level_execute_command(): starting 28023 1726853641.39296: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853641.40063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853641.40074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853641.40100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853641.40103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853641.40124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853641.40127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853641.40180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853641.40184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853641.40255: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853641.41995: stdout chunk (state=3): >>>/root <<< 28023 1726853641.42488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853641.42492: stdout chunk (state=3): >>><<< 28023 1726853641.42495: stderr chunk (state=3): >>><<< 28023 1726853641.42498: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853641.42501: _low_level_execute_command(): starting 28023 1726853641.42503: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095 `" && echo ansible-tmp-1726853641.4239602-29522-104607923287095="` echo 
/root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095 `" ) && sleep 0' 28023 1726853641.43196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853641.43208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853641.43221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853641.43322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853641.43534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853641.43560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853641.45596: stdout chunk (state=3): >>>ansible-tmp-1726853641.4239602-29522-104607923287095=/root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095 <<< 28023 1726853641.45646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853641.45830: stderr chunk (state=3): >>><<< 28023 1726853641.45834: stdout chunk (state=3): >>><<< 28023 1726853641.45837: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853641.4239602-29522-104607923287095=/root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853641.45976: variable 'ansible_module_compression' from source: unknown 28023 1726853641.45980: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28023 1726853641.46039: variable 'ansible_facts' from source: unknown 28023 1726853641.46178: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095/AnsiballZ_network_connections.py 28023 1726853641.46360: Sending initial data 28023 1726853641.46439: Sent initial data (168 bytes) 28023 1726853641.46991: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853641.47004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853641.47015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853641.47087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853641.47186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853641.47189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853641.47192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853641.47255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853641.48939: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853641.49011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853641.49088: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpbc67iont /root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095/AnsiballZ_network_connections.py <<< 28023 1726853641.49092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095/AnsiballZ_network_connections.py" <<< 28023 1726853641.49134: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpbc67iont" to remote "/root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095/AnsiballZ_network_connections.py" <<< 28023 1726853641.50468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853641.50478: stdout chunk (state=3): >>><<< 28023 1726853641.50482: stderr chunk (state=3): >>><<< 28023 1726853641.50484: done transferring module to remote 28023 1726853641.50486: _low_level_execute_command(): starting 28023 1726853641.50489: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095/ /root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095/AnsiballZ_network_connections.py && sleep 0' 28023 1726853641.51091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 28023 1726853641.51094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853641.51097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853641.51186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853641.51189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853641.51240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853641.51259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853641.51284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853641.51404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853641.53480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853641.53485: stdout chunk (state=3): >>><<< 28023 1726853641.53488: stderr chunk (state=3): >>><<< 28023 1726853641.53490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853641.53493: _low_level_execute_command(): starting 28023 1726853641.53495: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095/AnsiballZ_network_connections.py && sleep 0' 28023 1726853641.53967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853641.53981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853641.53993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853641.54008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853641.54021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853641.54029: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853641.54235: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853641.54243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853641.54273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853641.96536: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kko899i3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kko899i3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/dac9b760-b2ba-4f99-bc3f-cd7e791a7d43: error=unknown <<< 28023 1726853641.98317: stdout chunk (state=3): >>>Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kko899i3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kko899i3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/3c3e532f-c676-4575-81e5-c6f885a09e34: error=unknown <<< 28023 1726853641.98541: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28023 1726853642.00443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853642.00476: stderr chunk (state=3): >>><<< 28023 1726853642.00479: stdout chunk (state=3): >>><<< 28023 1726853642.00502: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kko899i3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kko899i3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/dac9b760-b2ba-4f99-bc3f-cd7e791a7d43: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kko899i3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_kko899i3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/3c3e532f-c676-4575-81e5-c6f885a09e34: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": 
"absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853642.00534: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'ethtest1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853642.00542: _low_level_execute_command(): starting 28023 1726853642.00546: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853641.4239602-29522-104607923287095/ > /dev/null 2>&1 && sleep 0' 28023 1726853642.01023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853642.01026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.01033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration <<< 28023 1726853642.01036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853642.01039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.01109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853642.01112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.01118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.01216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.03137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853642.03161: stderr chunk (state=3): >>><<< 28023 1726853642.03165: stdout chunk (state=3): >>><<< 28023 1726853642.03185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853642.03192: handler run complete 28023 1726853642.03216: attempt loop complete, returning result 28023 1726853642.03219: _execute() done 28023 1726853642.03221: dumping result to json 28023 1726853642.03224: done dumping result, returning 28023 1726853642.03234: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-fdb6-dad7-000000000651] 28023 1726853642.03237: sending task result for task 02083763-bbaf-fdb6-dad7-000000000651 28023 1726853642.03344: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000651 28023 1726853642.03346: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 28023 1726853642.03442: no more pending results, returning what we have 28023 1726853642.03446: results queue empty 28023 1726853642.03447: checking for any_errors_fatal 28023 1726853642.03452: done checking for any_errors_fatal 28023 1726853642.03453: checking for max_fail_percentage 28023 1726853642.03454: done checking for max_fail_percentage 28023 1726853642.03455: checking to see if all hosts have failed and the running result is not ok 28023 1726853642.03456: done checking to see if all hosts have failed 
28023 1726853642.03457: getting the remaining hosts for this loop 28023 1726853642.03458: done getting the remaining hosts for this loop 28023 1726853642.03461: getting the next task for host managed_node3 28023 1726853642.03468: done getting next task for host managed_node3 28023 1726853642.03479: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28023 1726853642.03483: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853642.03494: getting variables 28023 1726853642.03495: in VariableManager get_vars() 28023 1726853642.03537: Calling all_inventory to load vars for managed_node3 28023 1726853642.03539: Calling groups_inventory to load vars for managed_node3 28023 1726853642.03541: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853642.03551: Calling all_plugins_play to load vars for managed_node3 28023 1726853642.03553: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853642.03555: Calling groups_plugins_play to load vars for managed_node3 28023 1726853642.04497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853642.05353: done with get_vars() 28023 1726853642.05374: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:34:02 -0400 (0:00:00.788) 0:00:34.138 ****** 28023 1726853642.05438: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 28023 1726853642.05699: worker is 1 (out of 1 available) 28023 1726853642.05712: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 28023 1726853642.05724: done queuing things up, now waiting for results queue to drain 28023 1726853642.05726: waiting for pending results... 
28023 1726853642.05918: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 28023 1726853642.06002: in run() - task 02083763-bbaf-fdb6-dad7-000000000652 28023 1726853642.06014: variable 'ansible_search_path' from source: unknown 28023 1726853642.06018: variable 'ansible_search_path' from source: unknown 28023 1726853642.06045: calling self._execute() 28023 1726853642.06127: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.06132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.06140: variable 'omit' from source: magic vars 28023 1726853642.06427: variable 'ansible_distribution_major_version' from source: facts 28023 1726853642.06438: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853642.06522: variable 'network_state' from source: role '' defaults 28023 1726853642.06532: Evaluated conditional (network_state != {}): False 28023 1726853642.06535: when evaluation is False, skipping this task 28023 1726853642.06538: _execute() done 28023 1726853642.06540: dumping result to json 28023 1726853642.06545: done dumping result, returning 28023 1726853642.06552: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-fdb6-dad7-000000000652] 28023 1726853642.06557: sending task result for task 02083763-bbaf-fdb6-dad7-000000000652 28023 1726853642.06645: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000652 28023 1726853642.06648: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28023 1726853642.06697: no more pending results, returning what we have 28023 1726853642.06700: results queue empty 28023 1726853642.06701: checking for any_errors_fatal 28023 1726853642.06712: done checking for any_errors_fatal 
28023 1726853642.06713: checking for max_fail_percentage 28023 1726853642.06715: done checking for max_fail_percentage 28023 1726853642.06715: checking to see if all hosts have failed and the running result is not ok 28023 1726853642.06716: done checking to see if all hosts have failed 28023 1726853642.06717: getting the remaining hosts for this loop 28023 1726853642.06718: done getting the remaining hosts for this loop 28023 1726853642.06722: getting the next task for host managed_node3 28023 1726853642.06729: done getting next task for host managed_node3 28023 1726853642.06733: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28023 1726853642.06737: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853642.06759: getting variables 28023 1726853642.06760: in VariableManager get_vars() 28023 1726853642.06805: Calling all_inventory to load vars for managed_node3 28023 1726853642.06807: Calling groups_inventory to load vars for managed_node3 28023 1726853642.06809: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853642.06819: Calling all_plugins_play to load vars for managed_node3 28023 1726853642.06822: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853642.06824: Calling groups_plugins_play to load vars for managed_node3 28023 1726853642.07594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853642.08468: done with get_vars() 28023 1726853642.08487: done getting variables 28023 1726853642.08531: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:34:02 -0400 (0:00:00.031) 0:00:34.169 ****** 28023 1726853642.08553: entering _queue_task() for managed_node3/debug 28023 1726853642.08803: worker is 1 (out of 1 available) 28023 1726853642.08815: exiting _queue_task() for managed_node3/debug 28023 1726853642.08827: done queuing things up, now waiting for results queue to drain 28023 1726853642.08829: waiting for pending results... 
28023 1726853642.09015: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28023 1726853642.09098: in run() - task 02083763-bbaf-fdb6-dad7-000000000653 28023 1726853642.09111: variable 'ansible_search_path' from source: unknown 28023 1726853642.09115: variable 'ansible_search_path' from source: unknown 28023 1726853642.09141: calling self._execute() 28023 1726853642.09219: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.09223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.09231: variable 'omit' from source: magic vars 28023 1726853642.09512: variable 'ansible_distribution_major_version' from source: facts 28023 1726853642.09522: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853642.09528: variable 'omit' from source: magic vars 28023 1726853642.09566: variable 'omit' from source: magic vars 28023 1726853642.09591: variable 'omit' from source: magic vars 28023 1726853642.09626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853642.09654: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853642.09673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853642.09687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853642.09696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853642.09725: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853642.09743: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.09748: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 28023 1726853642.09831: Set connection var ansible_shell_type to sh 28023 1726853642.09838: Set connection var ansible_shell_executable to /bin/sh 28023 1726853642.09844: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853642.09848: Set connection var ansible_connection to ssh 28023 1726853642.09854: Set connection var ansible_pipelining to False 28023 1726853642.09861: Set connection var ansible_timeout to 10 28023 1726853642.09884: variable 'ansible_shell_executable' from source: unknown 28023 1726853642.09887: variable 'ansible_connection' from source: unknown 28023 1726853642.09890: variable 'ansible_module_compression' from source: unknown 28023 1726853642.09892: variable 'ansible_shell_type' from source: unknown 28023 1726853642.09894: variable 'ansible_shell_executable' from source: unknown 28023 1726853642.09896: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.09899: variable 'ansible_pipelining' from source: unknown 28023 1726853642.09903: variable 'ansible_timeout' from source: unknown 28023 1726853642.09907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.10008: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853642.10017: variable 'omit' from source: magic vars 28023 1726853642.10022: starting attempt loop 28023 1726853642.10026: running the handler 28023 1726853642.10122: variable '__network_connections_result' from source: set_fact 28023 1726853642.10182: handler run complete 28023 1726853642.10196: attempt loop complete, returning result 28023 1726853642.10198: _execute() done 28023 1726853642.10201: dumping result to json 28023 1726853642.10203: 
done dumping result, returning 28023 1726853642.10213: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-fdb6-dad7-000000000653] 28023 1726853642.10217: sending task result for task 02083763-bbaf-fdb6-dad7-000000000653 28023 1726853642.10302: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000653 28023 1726853642.10305: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 28023 1726853642.10369: no more pending results, returning what we have 28023 1726853642.10374: results queue empty 28023 1726853642.10375: checking for any_errors_fatal 28023 1726853642.10380: done checking for any_errors_fatal 28023 1726853642.10381: checking for max_fail_percentage 28023 1726853642.10382: done checking for max_fail_percentage 28023 1726853642.10383: checking to see if all hosts have failed and the running result is not ok 28023 1726853642.10384: done checking to see if all hosts have failed 28023 1726853642.10384: getting the remaining hosts for this loop 28023 1726853642.10386: done getting the remaining hosts for this loop 28023 1726853642.10389: getting the next task for host managed_node3 28023 1726853642.10396: done getting next task for host managed_node3 28023 1726853642.10400: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28023 1726853642.10405: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853642.10418: getting variables 28023 1726853642.10420: in VariableManager get_vars() 28023 1726853642.10462: Calling all_inventory to load vars for managed_node3 28023 1726853642.10465: Calling groups_inventory to load vars for managed_node3 28023 1726853642.10467: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853642.10483: Calling all_plugins_play to load vars for managed_node3 28023 1726853642.10486: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853642.10489: Calling groups_plugins_play to load vars for managed_node3 28023 1726853642.11389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853642.12776: done with get_vars() 28023 1726853642.12796: done getting variables 28023 1726853642.12865: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:34:02 -0400 (0:00:00.043) 0:00:34.212 ****** 28023 1726853642.12893: entering _queue_task() for managed_node3/debug 28023 1726853642.13159: worker is 1 (out of 1 available) 28023 
1726853642.13174: exiting _queue_task() for managed_node3/debug 28023 1726853642.13188: done queuing things up, now waiting for results queue to drain 28023 1726853642.13189: waiting for pending results... 28023 1726853642.13374: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28023 1726853642.13465: in run() - task 02083763-bbaf-fdb6-dad7-000000000654 28023 1726853642.13480: variable 'ansible_search_path' from source: unknown 28023 1726853642.13484: variable 'ansible_search_path' from source: unknown 28023 1726853642.13511: calling self._execute() 28023 1726853642.13592: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.13596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.13604: variable 'omit' from source: magic vars 28023 1726853642.13883: variable 'ansible_distribution_major_version' from source: facts 28023 1726853642.13892: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853642.13898: variable 'omit' from source: magic vars 28023 1726853642.13934: variable 'omit' from source: magic vars 28023 1726853642.13964: variable 'omit' from source: magic vars 28023 1726853642.13993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853642.14020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853642.14036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853642.14048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853642.14063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853642.14087: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853642.14091: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.14094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.14159: Set connection var ansible_shell_type to sh 28023 1726853642.14163: Set connection var ansible_shell_executable to /bin/sh 28023 1726853642.14168: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853642.14178: Set connection var ansible_connection to ssh 28023 1726853642.14182: Set connection var ansible_pipelining to False 28023 1726853642.14185: Set connection var ansible_timeout to 10 28023 1726853642.14206: variable 'ansible_shell_executable' from source: unknown 28023 1726853642.14209: variable 'ansible_connection' from source: unknown 28023 1726853642.14212: variable 'ansible_module_compression' from source: unknown 28023 1726853642.14215: variable 'ansible_shell_type' from source: unknown 28023 1726853642.14217: variable 'ansible_shell_executable' from source: unknown 28023 1726853642.14219: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.14221: variable 'ansible_pipelining' from source: unknown 28023 1726853642.14225: variable 'ansible_timeout' from source: unknown 28023 1726853642.14229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.14331: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853642.14341: variable 'omit' from source: magic vars 28023 1726853642.14346: starting attempt loop 28023 1726853642.14349: running the handler 28023 1726853642.14390: variable '__network_connections_result' from source: set_fact 28023 
1726853642.14446: variable '__network_connections_result' from source: set_fact 28023 1726853642.14531: handler run complete 28023 1726853642.14547: attempt loop complete, returning result 28023 1726853642.14550: _execute() done 28023 1726853642.14552: dumping result to json 28023 1726853642.14560: done dumping result, returning 28023 1726853642.14565: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-fdb6-dad7-000000000654] 28023 1726853642.14570: sending task result for task 02083763-bbaf-fdb6-dad7-000000000654 28023 1726853642.14659: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000654 28023 1726853642.14662: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 28023 1726853642.14748: no more pending results, returning what we have 28023 1726853642.14751: results queue empty 28023 1726853642.14752: checking for any_errors_fatal 28023 1726853642.14762: done checking for any_errors_fatal 28023 1726853642.14762: checking for max_fail_percentage 28023 1726853642.14764: done checking for max_fail_percentage 28023 1726853642.14765: checking to see if all hosts have failed and the running result is not ok 28023 1726853642.14766: done checking to see if all hosts have failed 28023 1726853642.14766: getting the remaining hosts for this loop 28023 1726853642.14768: done getting the remaining hosts for this loop 28023 1726853642.14772: getting the next task for host managed_node3 28023 1726853642.14780: 
done getting next task for host managed_node3 28023 1726853642.14783: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28023 1726853642.14787: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853642.14797: getting variables 28023 1726853642.14799: in VariableManager get_vars() 28023 1726853642.14835: Calling all_inventory to load vars for managed_node3 28023 1726853642.14837: Calling groups_inventory to load vars for managed_node3 28023 1726853642.14839: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853642.14848: Calling all_plugins_play to load vars for managed_node3 28023 1726853642.14850: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853642.14853: Calling groups_plugins_play to load vars for managed_node3 28023 1726853642.16232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853642.18060: done with get_vars() 28023 1726853642.18086: done getting variables 28023 1726853642.18151: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:34:02 -0400 (0:00:00.052) 0:00:34.265 ****** 28023 1726853642.18189: entering _queue_task() for managed_node3/debug 28023 1726853642.18777: worker is 1 (out of 1 available) 28023 1726853642.18786: exiting _queue_task() for managed_node3/debug 28023 1726853642.18797: done queuing things up, now waiting for results queue to drain 28023 1726853642.18798: waiting for pending results... 
28023 1726853642.18991: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28023 1726853642.19055: in run() - task 02083763-bbaf-fdb6-dad7-000000000655 28023 1726853642.19132: variable 'ansible_search_path' from source: unknown 28023 1726853642.19135: variable 'ansible_search_path' from source: unknown 28023 1726853642.19137: calling self._execute() 28023 1726853642.19238: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.19250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.19267: variable 'omit' from source: magic vars 28023 1726853642.19708: variable 'ansible_distribution_major_version' from source: facts 28023 1726853642.19727: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853642.19895: variable 'network_state' from source: role '' defaults 28023 1726853642.19899: Evaluated conditional (network_state != {}): False 28023 1726853642.19902: when evaluation is False, skipping this task 28023 1726853642.19908: _execute() done 28023 1726853642.19916: dumping result to json 28023 1726853642.19955: done dumping result, returning 28023 1726853642.19961: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-fdb6-dad7-000000000655] 28023 1726853642.19964: sending task result for task 02083763-bbaf-fdb6-dad7-000000000655 28023 1726853642.20220: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000655 28023 1726853642.20224: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 28023 1726853642.20282: no more pending results, returning what we have 28023 1726853642.20286: results queue empty 28023 1726853642.20287: checking for any_errors_fatal 28023 1726853642.20298: done checking for any_errors_fatal 28023 1726853642.20299: checking for 
max_fail_percentage 28023 1726853642.20301: done checking for max_fail_percentage 28023 1726853642.20301: checking to see if all hosts have failed and the running result is not ok 28023 1726853642.20302: done checking to see if all hosts have failed 28023 1726853642.20303: getting the remaining hosts for this loop 28023 1726853642.20305: done getting the remaining hosts for this loop 28023 1726853642.20309: getting the next task for host managed_node3 28023 1726853642.20317: done getting next task for host managed_node3 28023 1726853642.20322: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28023 1726853642.20327: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853642.20356: getting variables 28023 1726853642.20361: in VariableManager get_vars() 28023 1726853642.20409: Calling all_inventory to load vars for managed_node3 28023 1726853642.20412: Calling groups_inventory to load vars for managed_node3 28023 1726853642.20415: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853642.20429: Calling all_plugins_play to load vars for managed_node3 28023 1726853642.20432: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853642.20436: Calling groups_plugins_play to load vars for managed_node3 28023 1726853642.21992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853642.23669: done with get_vars() 28023 1726853642.23698: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:34:02 -0400 (0:00:00.056) 0:00:34.321 ****** 28023 1726853642.23801: entering _queue_task() for managed_node3/ping 28023 1726853642.24165: worker is 1 (out of 1 available) 28023 1726853642.24178: exiting _queue_task() for managed_node3/ping 28023 1726853642.24191: done queuing things up, now waiting for results queue to drain 28023 1726853642.24192: waiting for pending results... 
28023 1726853642.24567: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 28023 1726853642.24664: in run() - task 02083763-bbaf-fdb6-dad7-000000000656 28023 1726853642.24690: variable 'ansible_search_path' from source: unknown 28023 1726853642.24698: variable 'ansible_search_path' from source: unknown 28023 1726853642.24770: calling self._execute() 28023 1726853642.24863: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.24883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.24898: variable 'omit' from source: magic vars 28023 1726853642.25376: variable 'ansible_distribution_major_version' from source: facts 28023 1726853642.25380: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853642.25382: variable 'omit' from source: magic vars 28023 1726853642.25424: variable 'omit' from source: magic vars 28023 1726853642.25467: variable 'omit' from source: magic vars 28023 1726853642.25515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853642.25566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853642.25595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853642.25616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853642.25749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853642.25752: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853642.25755: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.25760: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 28023 1726853642.25800: Set connection var ansible_shell_type to sh 28023 1726853642.25814: Set connection var ansible_shell_executable to /bin/sh 28023 1726853642.25825: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853642.25836: Set connection var ansible_connection to ssh 28023 1726853642.25846: Set connection var ansible_pipelining to False 28023 1726853642.25866: Set connection var ansible_timeout to 10 28023 1726853642.25899: variable 'ansible_shell_executable' from source: unknown 28023 1726853642.25907: variable 'ansible_connection' from source: unknown 28023 1726853642.25915: variable 'ansible_module_compression' from source: unknown 28023 1726853642.25922: variable 'ansible_shell_type' from source: unknown 28023 1726853642.25929: variable 'ansible_shell_executable' from source: unknown 28023 1726853642.25937: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.25945: variable 'ansible_pipelining' from source: unknown 28023 1726853642.25951: variable 'ansible_timeout' from source: unknown 28023 1726853642.25968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.26192: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853642.26209: variable 'omit' from source: magic vars 28023 1726853642.26221: starting attempt loop 28023 1726853642.26228: running the handler 28023 1726853642.26249: _low_level_execute_command(): starting 28023 1726853642.26265: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853642.27013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853642.27059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 
1726853642.27078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853642.27154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853642.27182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.27228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.27300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.29014: stdout chunk (state=3): >>>/root <<< 28023 1726853642.29107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853642.29282: stderr chunk (state=3): >>><<< 28023 1726853642.29285: stdout chunk (state=3): >>><<< 28023 1726853642.29288: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853642.29291: _low_level_execute_command(): starting 28023 1726853642.29293: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623 `" && echo ansible-tmp-1726853642.291803-29558-160895721804623="` echo /root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623 `" ) && sleep 0' 28023 1726853642.29837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853642.29888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.29980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853642.29998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.30022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.30128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.32146: stdout chunk (state=3): >>>ansible-tmp-1726853642.291803-29558-160895721804623=/root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623 <<< 28023 1726853642.32287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853642.32476: stderr chunk (state=3): >>><<< 28023 1726853642.32479: stdout chunk (state=3): >>><<< 28023 1726853642.32482: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853642.291803-29558-160895721804623=/root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853642.32484: variable 'ansible_module_compression' from source: unknown 28023 1726853642.32486: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28023 1726853642.32487: variable 'ansible_facts' from source: unknown 28023 1726853642.32540: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623/AnsiballZ_ping.py 28023 1726853642.32780: Sending initial data 28023 1726853642.32783: Sent initial data (152 bytes) 28023 1726853642.33325: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853642.33341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853642.33385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.33452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853642.33470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.33493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.33594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.35384: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853642.35432: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853642.35515: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpz4bu49w0 /root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623/AnsiballZ_ping.py <<< 28023 1726853642.35518: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623/AnsiballZ_ping.py" <<< 28023 1726853642.35575: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpz4bu49w0" to remote "/root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623/AnsiballZ_ping.py" <<< 28023 1726853642.36436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853642.36439: stdout chunk (state=3): >>><<< 28023 1726853642.36442: stderr chunk (state=3): >>><<< 28023 1726853642.36444: done transferring module to remote 28023 1726853642.36446: _low_level_execute_command(): starting 28023 1726853642.36448: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623/ /root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623/AnsiballZ_ping.py && sleep 0' 28023 1726853642.37081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853642.37097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853642.37117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853642.37186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.37238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853642.37255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.37283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.37363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.39309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853642.39326: stdout chunk (state=3): >>><<< 28023 1726853642.39340: stderr chunk (state=3): >>><<< 28023 1726853642.39447: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853642.39451: _low_level_execute_command(): starting 28023 1726853642.39454: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623/AnsiballZ_ping.py && sleep 0' 28023 1726853642.40038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853642.40051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853642.40068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853642.40136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.40193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' 
<<< 28023 1726853642.40210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.40245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.40349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.55797: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28023 1726853642.57215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853642.57242: stderr chunk (state=3): >>><<< 28023 1726853642.57246: stdout chunk (state=3): >>><<< 28023 1726853642.57264: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 Shared connection to 10.31.11.217 closed. 28023 1726853642.57287: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853642.57294: _low_level_execute_command(): starting 28023 1726853642.57299: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853642.291803-29558-160895721804623/ > /dev/null 2>&1 && sleep 0' 28023 1726853642.57730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853642.57761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853642.57764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853642.57767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.57769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853642.57773: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.57830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853642.57838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.57840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.57894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.59769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853642.59791: stderr chunk (state=3): >>><<< 28023 1726853642.59794: stdout chunk (state=3): >>><<< 28023 1726853642.59806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853642.59814: handler run complete 28023 1726853642.59826: attempt loop complete, returning result 28023 1726853642.59829: _execute() done 28023 1726853642.59832: dumping result to json 28023 1726853642.59835: done dumping result, returning 28023 1726853642.59843: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-fdb6-dad7-000000000656] 28023 1726853642.59846: sending task result for task 02083763-bbaf-fdb6-dad7-000000000656 28023 1726853642.59938: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000656 28023 1726853642.59940: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 28023 1726853642.60005: no more pending results, returning what we have 28023 1726853642.60008: results queue empty 28023 1726853642.60009: checking for any_errors_fatal 28023 1726853642.60015: done checking for any_errors_fatal 28023 1726853642.60016: checking for max_fail_percentage 28023 1726853642.60017: done checking for max_fail_percentage 28023 1726853642.60018: checking to see if all hosts have failed and the running result is not ok 28023 1726853642.60019: done checking to see if all hosts have failed 28023 1726853642.60019: getting the remaining hosts for this loop 28023 1726853642.60021: done getting the remaining hosts for this loop 28023 1726853642.60024: getting the next task for host managed_node3 28023 1726853642.60035: done getting next task for host managed_node3 28023 1726853642.60036: ^ task is: TASK: meta (role_complete) 28023 1726853642.60040: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853642.60050: getting variables 28023 1726853642.60052: in VariableManager get_vars() 28023 1726853642.60098: Calling all_inventory to load vars for managed_node3 28023 1726853642.60100: Calling groups_inventory to load vars for managed_node3 28023 1726853642.60103: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853642.60113: Calling all_plugins_play to load vars for managed_node3 28023 1726853642.60116: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853642.60118: Calling groups_plugins_play to load vars for managed_node3 28023 1726853642.61037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853642.61893: done with get_vars() 28023 1726853642.61910: done getting variables 28023 1726853642.61966: done queuing things up, now waiting for results queue to drain 28023 1726853642.61967: results queue empty 28023 1726853642.61968: checking for any_errors_fatal 28023 1726853642.61969: done checking for any_errors_fatal 28023 1726853642.61970: checking for max_fail_percentage 28023 1726853642.61972: done checking for max_fail_percentage 28023 1726853642.61973: checking to see if all hosts have failed and the running result is not ok 28023 1726853642.61973: done checking to see if all hosts have failed 28023 
1726853642.61974: getting the remaining hosts for this loop 28023 1726853642.61974: done getting the remaining hosts for this loop 28023 1726853642.61976: getting the next task for host managed_node3 28023 1726853642.61979: done getting next task for host managed_node3 28023 1726853642.61981: ^ task is: TASK: Delete interface1 28023 1726853642.61982: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853642.61984: getting variables 28023 1726853642.61984: in VariableManager get_vars() 28023 1726853642.61994: Calling all_inventory to load vars for managed_node3 28023 1726853642.61995: Calling groups_inventory to load vars for managed_node3 28023 1726853642.61996: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853642.62000: Calling all_plugins_play to load vars for managed_node3 28023 1726853642.62001: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853642.62003: Calling groups_plugins_play to load vars for managed_node3 28023 1726853642.62616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853642.63460: done with get_vars() 28023 1726853642.63475: done getting variables TASK [Delete interface1] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:151 Friday 20 September 2024 13:34:02 -0400 
(0:00:00.397) 0:00:34.719 ****** 28023 1726853642.63526: entering _queue_task() for managed_node3/include_tasks 28023 1726853642.63780: worker is 1 (out of 1 available) 28023 1726853642.63792: exiting _queue_task() for managed_node3/include_tasks 28023 1726853642.63805: done queuing things up, now waiting for results queue to drain 28023 1726853642.63806: waiting for pending results... 28023 1726853642.63994: running TaskExecutor() for managed_node3/TASK: Delete interface1 28023 1726853642.64067: in run() - task 02083763-bbaf-fdb6-dad7-0000000000b5 28023 1726853642.64081: variable 'ansible_search_path' from source: unknown 28023 1726853642.64112: calling self._execute() 28023 1726853642.64195: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.64199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.64207: variable 'omit' from source: magic vars 28023 1726853642.64486: variable 'ansible_distribution_major_version' from source: facts 28023 1726853642.64496: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853642.64503: _execute() done 28023 1726853642.64505: dumping result to json 28023 1726853642.64508: done dumping result, returning 28023 1726853642.64515: done running TaskExecutor() for managed_node3/TASK: Delete interface1 [02083763-bbaf-fdb6-dad7-0000000000b5] 28023 1726853642.64519: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b5 28023 1726853642.64604: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b5 28023 1726853642.64607: WORKER PROCESS EXITING 28023 1726853642.64632: no more pending results, returning what we have 28023 1726853642.64637: in VariableManager get_vars() 28023 1726853642.64684: Calling all_inventory to load vars for managed_node3 28023 1726853642.64687: Calling groups_inventory to load vars for managed_node3 28023 1726853642.64689: Calling all_plugins_inventory to load vars for managed_node3 
28023 1726853642.64702: Calling all_plugins_play to load vars for managed_node3 28023 1726853642.64705: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853642.64708: Calling groups_plugins_play to load vars for managed_node3 28023 1726853642.65592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853642.66430: done with get_vars() 28023 1726853642.66443: variable 'ansible_search_path' from source: unknown 28023 1726853642.66454: we have included files to process 28023 1726853642.66454: generating all_blocks data 28023 1726853642.66455: done generating all_blocks data 28023 1726853642.66460: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28023 1726853642.66460: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28023 1726853642.66462: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28023 1726853642.66619: done processing included file 28023 1726853642.66621: iterating over new_blocks loaded from include file 28023 1726853642.66622: in VariableManager get_vars() 28023 1726853642.66634: done with get_vars() 28023 1726853642.66635: filtering new block on tags 28023 1726853642.66651: done filtering new block on tags 28023 1726853642.66653: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node3 28023 1726853642.66657: extending task lists for all hosts with included blocks 28023 1726853642.67344: done extending task lists 28023 1726853642.67346: done processing included files 28023 1726853642.67346: results queue empty 28023 1726853642.67347: checking for 
any_errors_fatal 28023 1726853642.67347: done checking for any_errors_fatal 28023 1726853642.67348: checking for max_fail_percentage 28023 1726853642.67349: done checking for max_fail_percentage 28023 1726853642.67349: checking to see if all hosts have failed and the running result is not ok 28023 1726853642.67350: done checking to see if all hosts have failed 28023 1726853642.67350: getting the remaining hosts for this loop 28023 1726853642.67351: done getting the remaining hosts for this loop 28023 1726853642.67353: getting the next task for host managed_node3 28023 1726853642.67355: done getting next task for host managed_node3 28023 1726853642.67357: ^ task is: TASK: Remove test interface if necessary 28023 1726853642.67359: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853642.67361: getting variables 28023 1726853642.67362: in VariableManager get_vars() 28023 1726853642.67372: Calling all_inventory to load vars for managed_node3 28023 1726853642.67375: Calling groups_inventory to load vars for managed_node3 28023 1726853642.67377: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853642.67381: Calling all_plugins_play to load vars for managed_node3 28023 1726853642.67383: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853642.67384: Calling groups_plugins_play to load vars for managed_node3 28023 1726853642.68044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853642.72684: done with get_vars() 28023 1726853642.72706: done getting variables 28023 1726853642.72745: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 13:34:02 -0400 (0:00:00.092) 0:00:34.811 ****** 28023 1726853642.72774: entering _queue_task() for managed_node3/command 28023 1726853642.73119: worker is 1 (out of 1 available) 28023 1726853642.73133: exiting _queue_task() for managed_node3/command 28023 1726853642.73147: done queuing things up, now waiting for results queue to drain 28023 1726853642.73148: waiting for pending results... 
28023 1726853642.73450: running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary 28023 1726853642.73543: in run() - task 02083763-bbaf-fdb6-dad7-000000000777 28023 1726853642.73547: variable 'ansible_search_path' from source: unknown 28023 1726853642.73550: variable 'ansible_search_path' from source: unknown 28023 1726853642.73651: calling self._execute() 28023 1726853642.73693: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.73704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.73717: variable 'omit' from source: magic vars 28023 1726853642.74107: variable 'ansible_distribution_major_version' from source: facts 28023 1726853642.74127: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853642.74139: variable 'omit' from source: magic vars 28023 1726853642.74192: variable 'omit' from source: magic vars 28023 1726853642.74289: variable 'interface' from source: set_fact 28023 1726853642.74317: variable 'omit' from source: magic vars 28023 1726853642.74363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853642.74476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853642.74480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853642.74482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853642.74485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853642.74503: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853642.74511: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.74525: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.74635: Set connection var ansible_shell_type to sh 28023 1726853642.74648: Set connection var ansible_shell_executable to /bin/sh 28023 1726853642.74658: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853642.74668: Set connection var ansible_connection to ssh 28023 1726853642.74680: Set connection var ansible_pipelining to False 28023 1726853642.74689: Set connection var ansible_timeout to 10 28023 1726853642.74718: variable 'ansible_shell_executable' from source: unknown 28023 1726853642.74726: variable 'ansible_connection' from source: unknown 28023 1726853642.74848: variable 'ansible_module_compression' from source: unknown 28023 1726853642.74852: variable 'ansible_shell_type' from source: unknown 28023 1726853642.74855: variable 'ansible_shell_executable' from source: unknown 28023 1726853642.74858: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853642.74859: variable 'ansible_pipelining' from source: unknown 28023 1726853642.74862: variable 'ansible_timeout' from source: unknown 28023 1726853642.74865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853642.74924: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853642.74942: variable 'omit' from source: magic vars 28023 1726853642.74957: starting attempt loop 28023 1726853642.74963: running the handler 28023 1726853642.74984: _low_level_execute_command(): starting 28023 1726853642.74996: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853642.75727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853642.75791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.75820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853642.75839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.75870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.75967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.77689: stdout chunk (state=3): >>>/root <<< 28023 1726853642.77841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853642.77856: stderr chunk (state=3): >>><<< 28023 1726853642.77884: stdout chunk (state=3): >>><<< 28023 1726853642.78025: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853642.78029: _low_level_execute_command(): starting 28023 1726853642.78032: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589 `" && echo ansible-tmp-1726853642.779154-29578-193599683270589="` echo /root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589 `" ) && sleep 0' 28023 1726853642.78621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853642.78634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853642.78649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853642.78669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853642.78695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853642.78715: stderr chunk (state=3): >>>debug2: match not found <<< 28023 
1726853642.78787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.78829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853642.78849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.78870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.78959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.81007: stdout chunk (state=3): >>>ansible-tmp-1726853642.779154-29578-193599683270589=/root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589 <<< 28023 1726853642.81184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853642.81187: stdout chunk (state=3): >>><<< 28023 1726853642.81189: stderr chunk (state=3): >>><<< 28023 1726853642.81212: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853642.779154-29578-193599683270589=/root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853642.81376: variable 'ansible_module_compression' from source: unknown 28023 1726853642.81379: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853642.81382: variable 'ansible_facts' from source: unknown 28023 1726853642.81449: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589/AnsiballZ_command.py 28023 1726853642.81623: Sending initial data 28023 1726853642.81632: Sent initial data (155 bytes) 28023 1726853642.82270: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853642.82327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.82399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853642.82416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.82450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.82540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.84230: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853642.84306: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853642.84389: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpasm72e8b /root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589/AnsiballZ_command.py <<< 28023 1726853642.84393: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589/AnsiballZ_command.py" <<< 28023 1726853642.84473: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpasm72e8b" to remote "/root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589/AnsiballZ_command.py" <<< 28023 1726853642.85451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853642.85454: stdout chunk (state=3): >>><<< 28023 1726853642.85456: stderr chunk (state=3): >>><<< 28023 1726853642.85461: done transferring module to remote 28023 1726853642.85463: _low_level_execute_command(): starting 28023 1726853642.85465: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589/ /root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589/AnsiballZ_command.py && sleep 0' 28023 1726853642.86088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853642.86148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853642.86170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.86194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.86286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853642.88206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853642.88210: stdout chunk (state=3): >>><<< 28023 1726853642.88212: stderr chunk (state=3): >>><<< 28023 1726853642.88319: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853642.88325: _low_level_execute_command(): starting 28023 1726853642.88328: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589/AnsiballZ_command.py && sleep 0' 28023 1726853642.88984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853642.88988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853642.89010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853642.89025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853642.89127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.06477: 
stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-20 13:34:03.050952", "end": "2024-09-20 13:34:03.063082", "delta": "0:00:00.012130", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853643.08929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853643.08954: stderr chunk (state=3): >>><<< 28023 1726853643.08957: stdout chunk (state=3): >>><<< 28023 1726853643.08979: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-20 13:34:03.050952", "end": "2024-09-20 13:34:03.063082", "delta": "0:00:00.012130", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853643.09013: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853643.09020: _low_level_execute_command(): starting 28023 1726853643.09025: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853642.779154-29578-193599683270589/ > /dev/null 2>&1 && sleep 0' 28023 1726853643.09436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.09465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.09469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853643.09474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.09528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853643.09535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853643.09537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.09594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.11861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853643.11866: stdout chunk (state=3): >>><<< 28023 1726853643.11868: stderr chunk (state=3): >>><<< 28023 1726853643.11889: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853643.11894: handler run complete 28023 1726853643.11918: Evaluated conditional (False): False 28023 1726853643.11928: attempt loop complete, returning result 28023 1726853643.11931: _execute() done 28023 1726853643.11934: dumping result to json 28023 1726853643.11939: done dumping result, returning 28023 1726853643.11947: done running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary [02083763-bbaf-fdb6-dad7-000000000777] 28023 1726853643.11952: sending task result for task 02083763-bbaf-fdb6-dad7-000000000777 28023 1726853643.12052: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000777 28023 1726853643.12055: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest1" ], "delta": "0:00:00.012130", "end": "2024-09-20 13:34:03.063082", "rc": 0, "start": "2024-09-20 13:34:03.050952" } 28023 1726853643.12127: no more pending results, returning what we have 28023 1726853643.12131: results queue empty 28023 1726853643.12131: checking for any_errors_fatal 28023 1726853643.12133: done checking for any_errors_fatal 28023 1726853643.12133: checking for max_fail_percentage 28023 1726853643.12135: done checking for max_fail_percentage 28023 1726853643.12135: checking to see if all hosts have failed and the running result is not ok 28023 1726853643.12136: 
done checking to see if all hosts have failed 28023 1726853643.12137: getting the remaining hosts for this loop 28023 1726853643.12139: done getting the remaining hosts for this loop 28023 1726853643.12142: getting the next task for host managed_node3 28023 1726853643.12151: done getting next task for host managed_node3 28023 1726853643.12154: ^ task is: TASK: Assert interface1 is absent 28023 1726853643.12159: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853643.12164: getting variables 28023 1726853643.12165: in VariableManager get_vars() 28023 1726853643.12208: Calling all_inventory to load vars for managed_node3 28023 1726853643.12211: Calling groups_inventory to load vars for managed_node3 28023 1726853643.12213: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853643.12224: Calling all_plugins_play to load vars for managed_node3 28023 1726853643.12226: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853643.12229: Calling groups_plugins_play to load vars for managed_node3 28023 1726853643.13584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853643.15268: done with get_vars() 28023 1726853643.15294: done getting variables TASK [Assert interface1 is absent] ********************************************* task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:153 Friday 20 September 2024 13:34:03 -0400 (0:00:00.426) 0:00:35.237 ****** 28023 1726853643.15396: entering _queue_task() for managed_node3/include_tasks 28023 1726853643.15992: worker is 1 (out of 1 available) 28023 1726853643.16004: exiting _queue_task() for managed_node3/include_tasks 28023 1726853643.16015: done queuing things up, now waiting for results queue to drain 28023 1726853643.16016: waiting for pending results... 28023 1726853643.16147: running TaskExecutor() for managed_node3/TASK: Assert interface1 is absent 28023 1726853643.16355: in run() - task 02083763-bbaf-fdb6-dad7-0000000000b6 28023 1726853643.16362: variable 'ansible_search_path' from source: unknown 28023 1726853643.16366: calling self._execute() 28023 1726853643.16439: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.16460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.16481: variable 'omit' from source: magic vars 28023 1726853643.16900: variable 'ansible_distribution_major_version' from source: facts 28023 1726853643.16919: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853643.16930: _execute() done 28023 1726853643.16939: dumping result to json 28023 1726853643.16948: done dumping result, returning 28023 1726853643.16960: done running TaskExecutor() for managed_node3/TASK: Assert interface1 is absent [02083763-bbaf-fdb6-dad7-0000000000b6] 28023 1726853643.16973: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b6 28023 1726853643.17138: no more pending results, returning what we have 28023 1726853643.17143: in VariableManager get_vars() 28023 1726853643.17199: Calling all_inventory to load vars for managed_node3 28023 1726853643.17203: Calling groups_inventory to load vars for managed_node3 28023 1726853643.17205: Calling all_plugins_inventory 
to load vars for managed_node3 28023 1726853643.17220: Calling all_plugins_play to load vars for managed_node3 28023 1726853643.17224: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853643.17227: Calling groups_plugins_play to load vars for managed_node3 28023 1726853643.18086: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b6 28023 1726853643.18090: WORKER PROCESS EXITING 28023 1726853643.19118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853643.20484: done with get_vars() 28023 1726853643.20497: variable 'ansible_search_path' from source: unknown 28023 1726853643.20508: we have included files to process 28023 1726853643.20509: generating all_blocks data 28023 1726853643.20510: done generating all_blocks data 28023 1726853643.20514: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28023 1726853643.20514: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28023 1726853643.20516: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28023 1726853643.20624: in VariableManager get_vars() 28023 1726853643.20641: done with get_vars() 28023 1726853643.20718: done processing included file 28023 1726853643.20720: iterating over new_blocks loaded from include file 28023 1726853643.20721: in VariableManager get_vars() 28023 1726853643.20733: done with get_vars() 28023 1726853643.20734: filtering new block on tags 28023 1726853643.20757: done filtering new block on tags 28023 1726853643.20759: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for 
managed_node3 28023 1726853643.20763: extending task lists for all hosts with included blocks 28023 1726853643.21478: done extending task lists 28023 1726853643.21480: done processing included files 28023 1726853643.21480: results queue empty 28023 1726853643.21481: checking for any_errors_fatal 28023 1726853643.21485: done checking for any_errors_fatal 28023 1726853643.21485: checking for max_fail_percentage 28023 1726853643.21486: done checking for max_fail_percentage 28023 1726853643.21486: checking to see if all hosts have failed and the running result is not ok 28023 1726853643.21487: done checking to see if all hosts have failed 28023 1726853643.21488: getting the remaining hosts for this loop 28023 1726853643.21488: done getting the remaining hosts for this loop 28023 1726853643.21490: getting the next task for host managed_node3 28023 1726853643.21492: done getting next task for host managed_node3 28023 1726853643.21494: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28023 1726853643.21496: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853643.21498: getting variables 28023 1726853643.21498: in VariableManager get_vars() 28023 1726853643.21509: Calling all_inventory to load vars for managed_node3 28023 1726853643.21511: Calling groups_inventory to load vars for managed_node3 28023 1726853643.21512: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853643.21516: Calling all_plugins_play to load vars for managed_node3 28023 1726853643.21517: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853643.21519: Calling groups_plugins_play to load vars for managed_node3 28023 1726853643.22396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853643.23509: done with get_vars() 28023 1726853643.23526: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:34:03 -0400 (0:00:00.081) 0:00:35.319 ****** 28023 1726853643.23581: entering _queue_task() for managed_node3/include_tasks 28023 1726853643.23838: worker is 1 (out of 1 available) 28023 1726853643.23850: exiting _queue_task() for managed_node3/include_tasks 28023 1726853643.23862: done queuing things up, now waiting for results queue to drain 28023 1726853643.23864: waiting for pending results... 
28023 1726853643.24046: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 28023 1726853643.24122: in run() - task 02083763-bbaf-fdb6-dad7-000000000816 28023 1726853643.24132: variable 'ansible_search_path' from source: unknown 28023 1726853643.24135: variable 'ansible_search_path' from source: unknown 28023 1726853643.24167: calling self._execute() 28023 1726853643.24248: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.24252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.24263: variable 'omit' from source: magic vars 28023 1726853643.24543: variable 'ansible_distribution_major_version' from source: facts 28023 1726853643.24554: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853643.24562: _execute() done 28023 1726853643.24565: dumping result to json 28023 1726853643.24572: done dumping result, returning 28023 1726853643.24578: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-fdb6-dad7-000000000816] 28023 1726853643.24583: sending task result for task 02083763-bbaf-fdb6-dad7-000000000816 28023 1726853643.24666: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000816 28023 1726853643.24669: WORKER PROCESS EXITING 28023 1726853643.24697: no more pending results, returning what we have 28023 1726853643.24702: in VariableManager get_vars() 28023 1726853643.24749: Calling all_inventory to load vars for managed_node3 28023 1726853643.24752: Calling groups_inventory to load vars for managed_node3 28023 1726853643.24755: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853643.24769: Calling all_plugins_play to load vars for managed_node3 28023 1726853643.24778: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853643.24782: Calling groups_plugins_play to load vars for managed_node3 28023 
1726853643.26220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853643.27239: done with get_vars() 28023 1726853643.27254: variable 'ansible_search_path' from source: unknown 28023 1726853643.27255: variable 'ansible_search_path' from source: unknown 28023 1726853643.27285: we have included files to process 28023 1726853643.27286: generating all_blocks data 28023 1726853643.27287: done generating all_blocks data 28023 1726853643.27288: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853643.27288: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853643.27290: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853643.27415: done processing included file 28023 1726853643.27417: iterating over new_blocks loaded from include file 28023 1726853643.27418: in VariableManager get_vars() 28023 1726853643.27431: done with get_vars() 28023 1726853643.27432: filtering new block on tags 28023 1726853643.27449: done filtering new block on tags 28023 1726853643.27450: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 28023 1726853643.27454: extending task lists for all hosts with included blocks 28023 1726853643.27527: done extending task lists 28023 1726853643.27528: done processing included files 28023 1726853643.27528: results queue empty 28023 1726853643.27529: checking for any_errors_fatal 28023 1726853643.27531: done checking for any_errors_fatal 28023 1726853643.27532: checking for max_fail_percentage 28023 1726853643.27532: done checking for 
max_fail_percentage 28023 1726853643.27533: checking to see if all hosts have failed and the running result is not ok 28023 1726853643.27533: done checking to see if all hosts have failed 28023 1726853643.27534: getting the remaining hosts for this loop 28023 1726853643.27535: done getting the remaining hosts for this loop 28023 1726853643.27536: getting the next task for host managed_node3 28023 1726853643.27539: done getting next task for host managed_node3 28023 1726853643.27540: ^ task is: TASK: Get stat for interface {{ interface }} 28023 1726853643.27543: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False
28023 1726853643.27544: getting variables
28023 1726853643.27545: in VariableManager get_vars()
28023 1726853643.27555: Calling all_inventory to load vars for managed_node3
28023 1726853643.27556: Calling groups_inventory to load vars for managed_node3
28023 1726853643.27558: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853643.27562: Calling all_plugins_play to load vars for managed_node3
28023 1726853643.27564: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853643.27565: Calling groups_plugins_play to load vars for managed_node3
28023 1726853643.28268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853643.29759: done with get_vars()
28023 1726853643.29782: done getting variables
28023 1726853643.29946: variable 'interface' from source: set_fact

TASK [Get stat for interface ethtest1] *****************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 13:34:03 -0400 (0:00:00.063) 0:00:35.383 ******
28023 1726853643.29984: entering _queue_task() for managed_node3/stat
28023 1726853643.30341: worker is 1 (out of 1 available)
28023 1726853643.30354: exiting _queue_task() for managed_node3/stat
28023 1726853643.30369: done queuing things up, now waiting for results queue to drain
28023 1726853643.30372: waiting for pending results...
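The task being queued here comes from the included file get_interface_stat.yml. Judging from the module_args echoed later in this log (get_attributes, get_checksum, and get_mime all false, path /sys/class/net/ethtest1), the task file is presumably close to the following sketch; the register variable name is an assumption, not confirmed by this log:

```yaml
# Hypothetical reconstruction of tasks/get_interface_stat.yml, based on the
# stat module_args visible in this log. The register name is an assumption.
- name: Get stat for interface {{ interface }}
  stat:
    get_attributes: false
    get_checksum: false
    get_mime: false
    path: /sys/class/net/{{ interface }}
  register: interface_stat
```

The path works as a device-presence check because the Linux kernel exposes each network interface as a directory under /sys/class/net/, so stat's `exists` flag reports whether the interface is currently present.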
28023 1726853643.30735: running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest1 28023 1726853643.30829: in run() - task 02083763-bbaf-fdb6-dad7-0000000008bc 28023 1726853643.30834: variable 'ansible_search_path' from source: unknown 28023 1726853643.30838: variable 'ansible_search_path' from source: unknown 28023 1726853643.30872: calling self._execute() 28023 1726853643.30953: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.30957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.30967: variable 'omit' from source: magic vars 28023 1726853643.31233: variable 'ansible_distribution_major_version' from source: facts 28023 1726853643.31244: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853643.31250: variable 'omit' from source: magic vars 28023 1726853643.31296: variable 'omit' from source: magic vars 28023 1726853643.31364: variable 'interface' from source: set_fact 28023 1726853643.31382: variable 'omit' from source: magic vars 28023 1726853643.31414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853643.31442: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853643.31459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853643.31479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853643.31488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853643.31510: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853643.31513: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.31516: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.31587: Set connection var ansible_shell_type to sh 28023 1726853643.31591: Set connection var ansible_shell_executable to /bin/sh 28023 1726853643.31596: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853643.31601: Set connection var ansible_connection to ssh 28023 1726853643.31607: Set connection var ansible_pipelining to False 28023 1726853643.31612: Set connection var ansible_timeout to 10 28023 1726853643.31632: variable 'ansible_shell_executable' from source: unknown 28023 1726853643.31636: variable 'ansible_connection' from source: unknown 28023 1726853643.31638: variable 'ansible_module_compression' from source: unknown 28023 1726853643.31640: variable 'ansible_shell_type' from source: unknown 28023 1726853643.31643: variable 'ansible_shell_executable' from source: unknown 28023 1726853643.31645: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.31647: variable 'ansible_pipelining' from source: unknown 28023 1726853643.31651: variable 'ansible_timeout' from source: unknown 28023 1726853643.31654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.31807: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853643.31818: variable 'omit' from source: magic vars 28023 1726853643.31823: starting attempt loop 28023 1726853643.31826: running the handler 28023 1726853643.31838: _low_level_execute_command(): starting 28023 1726853643.31844: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853643.32335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.32370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.32377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.32379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.32413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853643.32429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.32525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.34226: stdout chunk (state=3): >>>/root <<< 28023 1726853643.34328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853643.34352: stderr chunk (state=3): >>><<< 28023 1726853643.34355: stdout chunk (state=3): >>><<< 28023 1726853643.34380: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853643.34391: _low_level_execute_command(): starting 28023 1726853643.34398: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475 `" && echo ansible-tmp-1726853643.3437986-29610-272889690337475="` echo /root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475 `" ) && sleep 0' 28023 1726853643.34845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.34848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853643.34850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28023 1726853643.34862: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.34865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.34909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853643.34913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.34983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.36952: stdout chunk (state=3): >>>ansible-tmp-1726853643.3437986-29610-272889690337475=/root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475 <<< 28023 1726853643.37047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853643.37078: stderr chunk (state=3): >>><<< 28023 1726853643.37082: stdout chunk (state=3): >>><<< 28023 1726853643.37097: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853643.3437986-29610-272889690337475=/root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853643.37140: variable 'ansible_module_compression' from source: unknown 28023 1726853643.37188: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28023 1726853643.37219: variable 'ansible_facts' from source: unknown 28023 1726853643.37285: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475/AnsiballZ_stat.py 28023 1726853643.37393: Sending initial data 28023 1726853643.37396: Sent initial data (153 bytes) 28023 1726853643.37838: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853643.37841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853643.37843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.37846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.37848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.37907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853643.37910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.37965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.39596: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28023 1726853643.39601: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853643.39650: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853643.39707: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmps3sro71m /root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475/AnsiballZ_stat.py <<< 28023 1726853643.39712: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475/AnsiballZ_stat.py" <<< 28023 1726853643.39762: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmps3sro71m" to remote "/root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475/AnsiballZ_stat.py" <<< 28023 1726853643.40359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853643.40403: stderr chunk (state=3): >>><<< 28023 1726853643.40407: stdout chunk (state=3): >>><<< 28023 1726853643.40425: done transferring module to remote 28023 1726853643.40434: _low_level_execute_command(): starting 28023 1726853643.40438: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475/ /root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475/AnsiballZ_stat.py && sleep 0' 28023 1726853643.40862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853643.40866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853643.40896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853643.40899: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.40901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.40907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.40956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853643.40965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.41025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.42877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853643.42905: stderr chunk (state=3): >>><<< 28023 1726853643.42909: stdout chunk (state=3): >>><<< 28023 1726853643.42922: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853643.42924: _low_level_execute_command(): starting 28023 1726853643.42930: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475/AnsiballZ_stat.py && sleep 0' 28023 1726853643.43361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.43364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853643.43396: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.43399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853643.43401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853643.43403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.43462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853643.43468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853643.43470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.43532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.59139: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28023 1726853643.60509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853643.60534: stderr chunk (state=3): >>><<< 28023 1726853643.60537: stdout chunk (state=3): >>><<< 28023 1726853643.60552: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853643.60581: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853643.60589: _low_level_execute_command(): starting 28023 1726853643.60595: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853643.3437986-29610-272889690337475/ > /dev/null 2>&1 && sleep 0' 28023 1726853643.61059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.61062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.61065: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.61067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.61116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853643.61119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853643.61125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.61186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.63060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853643.63089: stderr chunk (state=3): >>><<< 28023 1726853643.63094: stdout chunk (state=3): >>><<< 28023 1726853643.63114: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853643.63119: handler run complete 28023 1726853643.63134: attempt loop complete, returning result 28023 1726853643.63137: _execute() done 28023 1726853643.63140: dumping result to json 28023 1726853643.63142: done dumping result, returning 28023 1726853643.63150: done running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest1 [02083763-bbaf-fdb6-dad7-0000000008bc] 28023 1726853643.63155: sending task result for task 02083763-bbaf-fdb6-dad7-0000000008bc 28023 1726853643.63249: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000008bc 28023 1726853643.63251: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 28023 1726853643.63312: no more pending results, returning what we have 28023 1726853643.63315: results queue empty 28023 1726853643.63316: checking for any_errors_fatal 28023 1726853643.63317: done checking for any_errors_fatal 28023 1726853643.63318: checking for max_fail_percentage 28023 1726853643.63319: done checking for max_fail_percentage 28023 1726853643.63320: checking to see if all hosts have failed and the running result is not ok 28023 1726853643.63321: done checking to see if all hosts have failed 28023 1726853643.63322: getting the remaining hosts for this loop 28023 1726853643.63323: done getting the remaining 
hosts for this loop 28023 1726853643.63326: getting the next task for host managed_node3 28023 1726853643.63334: done getting next task for host managed_node3 28023 1726853643.63337: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 28023 1726853643.63341: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853643.63346: getting variables 28023 1726853643.63348: in VariableManager get_vars() 28023 1726853643.63395: Calling all_inventory to load vars for managed_node3 28023 1726853643.63398: Calling groups_inventory to load vars for managed_node3 28023 1726853643.63400: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853643.63412: Calling all_plugins_play to load vars for managed_node3 28023 1726853643.63414: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853643.63417: Calling groups_plugins_play to load vars for managed_node3 28023 1726853643.64240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853643.65120: done with get_vars() 28023 1726853643.65136: done getting variables 28023 1726853643.65183: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853643.65270: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest1'] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:34:03 -0400 (0:00:00.353) 0:00:35.736 ****** 28023 1726853643.65295: entering _queue_task() for managed_node3/assert 28023 1726853643.65539: worker is 1 (out of 1 available) 28023 1726853643.65550: exiting _queue_task() for managed_node3/assert 28023 1726853643.65566: done queuing things up, now waiting for results queue to drain 28023 1726853643.65568: waiting for pending results... 
28023 1726853643.65741: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'ethtest1' 28023 1726853643.65813: in run() - task 02083763-bbaf-fdb6-dad7-000000000817 28023 1726853643.65824: variable 'ansible_search_path' from source: unknown 28023 1726853643.65828: variable 'ansible_search_path' from source: unknown 28023 1726853643.65860: calling self._execute() 28023 1726853643.65940: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.65944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.65952: variable 'omit' from source: magic vars 28023 1726853643.66219: variable 'ansible_distribution_major_version' from source: facts 28023 1726853643.66230: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853643.66243: variable 'omit' from source: magic vars 28023 1726853643.66277: variable 'omit' from source: magic vars 28023 1726853643.66345: variable 'interface' from source: set_fact 28023 1726853643.66363: variable 'omit' from source: magic vars 28023 1726853643.66398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853643.66425: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853643.66444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853643.66459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853643.66472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853643.66496: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853643.66499: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.66502: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.66570: Set connection var ansible_shell_type to sh 28023 1726853643.66578: Set connection var ansible_shell_executable to /bin/sh 28023 1726853643.66584: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853643.66589: Set connection var ansible_connection to ssh 28023 1726853643.66594: Set connection var ansible_pipelining to False 28023 1726853643.66600: Set connection var ansible_timeout to 10 28023 1726853643.66620: variable 'ansible_shell_executable' from source: unknown 28023 1726853643.66623: variable 'ansible_connection' from source: unknown 28023 1726853643.66626: variable 'ansible_module_compression' from source: unknown 28023 1726853643.66629: variable 'ansible_shell_type' from source: unknown 28023 1726853643.66631: variable 'ansible_shell_executable' from source: unknown 28023 1726853643.66634: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.66636: variable 'ansible_pipelining' from source: unknown 28023 1726853643.66639: variable 'ansible_timeout' from source: unknown 28023 1726853643.66643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.66744: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853643.66753: variable 'omit' from source: magic vars 28023 1726853643.66759: starting attempt loop 28023 1726853643.66764: running the handler 28023 1726853643.66862: variable 'interface_stat' from source: set_fact 28023 1726853643.66873: Evaluated conditional (not interface_stat.stat.exists): True 28023 1726853643.66878: handler run complete 28023 1726853643.66893: attempt loop complete, returning result 
28023 1726853643.66896: _execute() done 28023 1726853643.66899: dumping result to json 28023 1726853643.66901: done dumping result, returning 28023 1726853643.66907: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'ethtest1' [02083763-bbaf-fdb6-dad7-000000000817] 28023 1726853643.66911: sending task result for task 02083763-bbaf-fdb6-dad7-000000000817 28023 1726853643.66990: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000817 28023 1726853643.66992: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 28023 1726853643.67038: no more pending results, returning what we have 28023 1726853643.67041: results queue empty 28023 1726853643.67042: checking for any_errors_fatal 28023 1726853643.67053: done checking for any_errors_fatal 28023 1726853643.67054: checking for max_fail_percentage 28023 1726853643.67056: done checking for max_fail_percentage 28023 1726853643.67057: checking to see if all hosts have failed and the running result is not ok 28023 1726853643.67058: done checking to see if all hosts have failed 28023 1726853643.67059: getting the remaining hosts for this loop 28023 1726853643.67060: done getting the remaining hosts for this loop 28023 1726853643.67063: getting the next task for host managed_node3 28023 1726853643.67073: done getting next task for host managed_node3 28023 1726853643.67077: ^ task is: TASK: Set interface0 28023 1726853643.67080: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 28023 1726853643.67084: getting variables 28023 1726853643.67085: in VariableManager get_vars() 28023 1726853643.67131: Calling all_inventory to load vars for managed_node3 28023 1726853643.67133: Calling groups_inventory to load vars for managed_node3 28023 1726853643.67135: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853643.67145: Calling all_plugins_play to load vars for managed_node3 28023 1726853643.67148: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853643.67151: Calling groups_plugins_play to load vars for managed_node3 28023 1726853643.68083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853643.68930: done with get_vars() 28023 1726853643.68948: done getting variables 28023 1726853643.68993: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set interface0] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:155 Friday 20 September 2024 13:34:03 -0400 (0:00:00.037) 0:00:35.774 ****** 28023 1726853643.69014: entering _queue_task() for managed_node3/set_fact 28023 1726853643.69261: worker is 1 (out of 1 available) 28023 1726853643.69276: exiting _queue_task() for managed_node3/set_fact 28023 1726853643.69290: done queuing things up, now waiting for results queue to drain 28023 1726853643.69292: waiting for pending results... 
28023 1726853643.69474: running TaskExecutor() for managed_node3/TASK: Set interface0 28023 1726853643.69536: in run() - task 02083763-bbaf-fdb6-dad7-0000000000b7 28023 1726853643.69548: variable 'ansible_search_path' from source: unknown 28023 1726853643.69581: calling self._execute() 28023 1726853643.69664: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.69669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.69679: variable 'omit' from source: magic vars 28023 1726853643.69945: variable 'ansible_distribution_major_version' from source: facts 28023 1726853643.69958: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853643.69964: variable 'omit' from source: magic vars 28023 1726853643.69992: variable 'omit' from source: magic vars 28023 1726853643.70012: variable 'interface0' from source: play vars 28023 1726853643.70069: variable 'interface0' from source: play vars 28023 1726853643.70083: variable 'omit' from source: magic vars 28023 1726853643.70115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853643.70144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853643.70163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853643.70181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853643.70191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853643.70213: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853643.70216: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.70219: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 28023 1726853643.70289: Set connection var ansible_shell_type to sh 28023 1726853643.70294: Set connection var ansible_shell_executable to /bin/sh 28023 1726853643.70300: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853643.70305: Set connection var ansible_connection to ssh 28023 1726853643.70310: Set connection var ansible_pipelining to False 28023 1726853643.70315: Set connection var ansible_timeout to 10 28023 1726853643.70334: variable 'ansible_shell_executable' from source: unknown 28023 1726853643.70338: variable 'ansible_connection' from source: unknown 28023 1726853643.70340: variable 'ansible_module_compression' from source: unknown 28023 1726853643.70343: variable 'ansible_shell_type' from source: unknown 28023 1726853643.70345: variable 'ansible_shell_executable' from source: unknown 28023 1726853643.70347: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.70350: variable 'ansible_pipelining' from source: unknown 28023 1726853643.70352: variable 'ansible_timeout' from source: unknown 28023 1726853643.70356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.70458: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853643.70472: variable 'omit' from source: magic vars 28023 1726853643.70477: starting attempt loop 28023 1726853643.70479: running the handler 28023 1726853643.70491: handler run complete 28023 1726853643.70504: attempt loop complete, returning result 28023 1726853643.70507: _execute() done 28023 1726853643.70509: dumping result to json 28023 1726853643.70512: done dumping result, returning 28023 1726853643.70515: done running TaskExecutor() for 
managed_node3/TASK: Set interface0 [02083763-bbaf-fdb6-dad7-0000000000b7] 28023 1726853643.70517: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b7 28023 1726853643.70592: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b7 28023 1726853643.70594: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "interface": "ethtest0" }, "changed": false } 28023 1726853643.70649: no more pending results, returning what we have 28023 1726853643.70651: results queue empty 28023 1726853643.70652: checking for any_errors_fatal 28023 1726853643.70658: done checking for any_errors_fatal 28023 1726853643.70659: checking for max_fail_percentage 28023 1726853643.70661: done checking for max_fail_percentage 28023 1726853643.70662: checking to see if all hosts have failed and the running result is not ok 28023 1726853643.70663: done checking to see if all hosts have failed 28023 1726853643.70663: getting the remaining hosts for this loop 28023 1726853643.70665: done getting the remaining hosts for this loop 28023 1726853643.70668: getting the next task for host managed_node3 28023 1726853643.70677: done getting next task for host managed_node3 28023 1726853643.70680: ^ task is: TASK: Delete interface0 28023 1726853643.70683: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853643.70688: getting variables 28023 1726853643.70689: in VariableManager get_vars() 28023 1726853643.70728: Calling all_inventory to load vars for managed_node3 28023 1726853643.70731: Calling groups_inventory to load vars for managed_node3 28023 1726853643.70733: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853643.70743: Calling all_plugins_play to load vars for managed_node3 28023 1726853643.70745: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853643.70747: Calling groups_plugins_play to load vars for managed_node3 28023 1726853643.71533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853643.72475: done with get_vars() 28023 1726853643.72490: done getting variables TASK [Delete interface0] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:158 Friday 20 September 2024 13:34:03 -0400 (0:00:00.035) 0:00:35.809 ****** 28023 1726853643.72554: entering _queue_task() for managed_node3/include_tasks 28023 1726853643.72779: worker is 1 (out of 1 available) 28023 1726853643.72792: exiting _queue_task() for managed_node3/include_tasks 28023 1726853643.72806: done queuing things up, now waiting for results queue to drain 28023 1726853643.72807: waiting for pending results... 
28023 1726853643.72985: running TaskExecutor() for managed_node3/TASK: Delete interface0 28023 1726853643.73052: in run() - task 02083763-bbaf-fdb6-dad7-0000000000b8 28023 1726853643.73067: variable 'ansible_search_path' from source: unknown 28023 1726853643.73097: calling self._execute() 28023 1726853643.73176: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.73182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.73190: variable 'omit' from source: magic vars 28023 1726853643.73450: variable 'ansible_distribution_major_version' from source: facts 28023 1726853643.73460: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853643.73475: _execute() done 28023 1726853643.73479: dumping result to json 28023 1726853643.73482: done dumping result, returning 28023 1726853643.73485: done running TaskExecutor() for managed_node3/TASK: Delete interface0 [02083763-bbaf-fdb6-dad7-0000000000b8] 28023 1726853643.73487: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b8 28023 1726853643.73563: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b8 28023 1726853643.73566: WORKER PROCESS EXITING 28023 1726853643.73596: no more pending results, returning what we have 28023 1726853643.73600: in VariableManager get_vars() 28023 1726853643.73646: Calling all_inventory to load vars for managed_node3 28023 1726853643.73649: Calling groups_inventory to load vars for managed_node3 28023 1726853643.73651: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853643.73664: Calling all_plugins_play to load vars for managed_node3 28023 1726853643.73667: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853643.73670: Calling groups_plugins_play to load vars for managed_node3 28023 1726853643.75055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 
1726853643.76113: done with get_vars() 28023 1726853643.76128: variable 'ansible_search_path' from source: unknown 28023 1726853643.76139: we have included files to process 28023 1726853643.76140: generating all_blocks data 28023 1726853643.76140: done generating all_blocks data 28023 1726853643.76143: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28023 1726853643.76144: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28023 1726853643.76145: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28023 1726853643.76272: done processing included file 28023 1726853643.76274: iterating over new_blocks loaded from include file 28023 1726853643.76275: in VariableManager get_vars() 28023 1726853643.76288: done with get_vars() 28023 1726853643.76289: filtering new block on tags 28023 1726853643.76305: done filtering new block on tags 28023 1726853643.76306: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node3 28023 1726853643.76310: extending task lists for all hosts with included blocks 28023 1726853643.77084: done extending task lists 28023 1726853643.77086: done processing included files 28023 1726853643.77086: results queue empty 28023 1726853643.77087: checking for any_errors_fatal 28023 1726853643.77089: done checking for any_errors_fatal 28023 1726853643.77089: checking for max_fail_percentage 28023 1726853643.77090: done checking for max_fail_percentage 28023 1726853643.77090: checking to see if all hosts have failed and the running result is not ok 28023 1726853643.77091: done checking to see if all hosts have failed 28023 1726853643.77092: getting 
the remaining hosts for this loop 28023 1726853643.77092: done getting the remaining hosts for this loop 28023 1726853643.77094: getting the next task for host managed_node3 28023 1726853643.77096: done getting next task for host managed_node3 28023 1726853643.77098: ^ task is: TASK: Remove test interface if necessary 28023 1726853643.77100: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853643.77102: getting variables 28023 1726853643.77102: in VariableManager get_vars() 28023 1726853643.77112: Calling all_inventory to load vars for managed_node3 28023 1726853643.77113: Calling groups_inventory to load vars for managed_node3 28023 1726853643.77114: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853643.77119: Calling all_plugins_play to load vars for managed_node3 28023 1726853643.77121: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853643.77124: Calling groups_plugins_play to load vars for managed_node3 28023 1726853643.78335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853643.79964: done with get_vars() 28023 1726853643.79985: done getting variables 28023 1726853643.80025: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 13:34:03 -0400 (0:00:00.074) 0:00:35.884 ****** 28023 1726853643.80054: entering _queue_task() for managed_node3/command 28023 1726853643.80388: worker is 1 (out of 1 available) 28023 1726853643.80403: exiting _queue_task() for managed_node3/command 28023 1726853643.80417: done queuing things up, now waiting for results queue to drain 28023 1726853643.80418: waiting for pending results... 
28023 1726853643.81017: running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary 28023 1726853643.81030: in run() - task 02083763-bbaf-fdb6-dad7-0000000008da 28023 1726853643.81159: variable 'ansible_search_path' from source: unknown 28023 1726853643.81163: variable 'ansible_search_path' from source: unknown 28023 1726853643.81201: calling self._execute() 28023 1726853643.81413: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.81416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.81427: variable 'omit' from source: magic vars 28023 1726853643.82235: variable 'ansible_distribution_major_version' from source: facts 28023 1726853643.82247: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853643.82253: variable 'omit' from source: magic vars 28023 1726853643.82449: variable 'omit' from source: magic vars 28023 1726853643.82735: variable 'interface' from source: set_fact 28023 1726853643.82739: variable 'omit' from source: magic vars 28023 1726853643.82741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853643.82873: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853643.82893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853643.82911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853643.82923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853643.82951: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853643.83068: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.83073: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.83166: Set connection var ansible_shell_type to sh 28023 1726853643.83287: Set connection var ansible_shell_executable to /bin/sh 28023 1726853643.83294: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853643.83299: Set connection var ansible_connection to ssh 28023 1726853643.83305: Set connection var ansible_pipelining to False 28023 1726853643.83310: Set connection var ansible_timeout to 10 28023 1726853643.83340: variable 'ansible_shell_executable' from source: unknown 28023 1726853643.83344: variable 'ansible_connection' from source: unknown 28023 1726853643.83346: variable 'ansible_module_compression' from source: unknown 28023 1726853643.83348: variable 'ansible_shell_type' from source: unknown 28023 1726853643.83351: variable 'ansible_shell_executable' from source: unknown 28023 1726853643.83353: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853643.83357: variable 'ansible_pipelining' from source: unknown 28023 1726853643.83364: variable 'ansible_timeout' from source: unknown 28023 1726853643.83386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853643.83649: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853643.83663: variable 'omit' from source: magic vars 28023 1726853643.83711: starting attempt loop 28023 1726853643.83715: running the handler 28023 1726853643.83717: _low_level_execute_command(): starting 28023 1726853643.83719: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853643.85242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.85255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853643.85259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.85263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853643.85344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853643.85348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.85412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.87129: stdout chunk (state=3): >>>/root <<< 28023 1726853643.87294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853643.87345: stderr chunk (state=3): >>><<< 28023 1726853643.87358: stdout chunk (state=3): >>><<< 28023 1726853643.87421: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853643.87678: _low_level_execute_command(): starting 28023 1726853643.87682: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448 `" && echo ansible-tmp-1726853643.8757887-29623-113121175929448="` echo /root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448 `" ) && sleep 0' 28023 1726853643.88797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853643.88874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.88898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853643.89090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.89176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.91196: stdout chunk (state=3): >>>ansible-tmp-1726853643.8757887-29623-113121175929448=/root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448 <<< 28023 1726853643.91488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853643.91882: stderr chunk (state=3): >>><<< 28023 1726853643.91885: stdout chunk (state=3): >>><<< 28023 1726853643.91887: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853643.8757887-29623-113121175929448=/root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853643.91890: variable 'ansible_module_compression' from source: unknown 28023 1726853643.91892: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853643.91894: variable 'ansible_facts' from source: unknown 28023 1726853643.92048: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448/AnsiballZ_command.py 28023 1726853643.92352: Sending initial data 28023 1726853643.92362: Sent initial data (156 bytes) 28023 1726853643.93586: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853643.93682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853643.93697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853643.93762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.93876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853643.95823: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853643.95883: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853643.96280: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpmqkzz7pz /root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448/AnsiballZ_command.py <<< 28023 1726853643.96284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448/AnsiballZ_command.py" <<< 28023 1726853643.96287: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpmqkzz7pz" to remote "/root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448/AnsiballZ_command.py" <<< 28023 1726853643.97604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853643.97655: stderr chunk (state=3): >>><<< 28023 1726853643.97658: stdout chunk (state=3): >>><<< 28023 1726853643.97720: done transferring module to remote 28023 1726853643.97731: _low_level_execute_command(): starting 28023 1726853643.97736: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448/ /root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448/AnsiballZ_command.py && sleep 0' 28023 1726853643.98978: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853643.99281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853643.99303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853643.99399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853644.01351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853644.01406: stderr chunk (state=3): >>><<< 28023 1726853644.01412: stdout chunk (state=3): >>><<< 28023 1726853644.01436: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853644.01445: _low_level_execute_command(): starting 28023 1726853644.01448: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448/AnsiballZ_command.py && sleep 0' 28023 1726853644.03085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853644.03101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853644.03118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853644.03250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853644.03365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853644.03512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853644.20297: stdout 
chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 13:34:04.191607", "end": "2024-09-20 13:34:04.199913", "delta": "0:00:00.008306", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853644.21981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853644.21994: stdout chunk (state=3): >>><<< 28023 1726853644.22007: stderr chunk (state=3): >>><<< 28023 1726853644.22042: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 13:34:04.191607", "end": "2024-09-20 13:34:04.199913", "delta": "0:00:00.008306", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853644.22092: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853644.22106: _low_level_execute_command(): starting 28023 1726853644.22115: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853643.8757887-29623-113121175929448/ > /dev/null 2>&1 && sleep 0' 28023 1726853644.22794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853644.22888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853644.22923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853644.22940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853644.22963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853644.23054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853644.24984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853644.24994: stdout chunk (state=3): >>><<< 28023 1726853644.25005: stderr chunk (state=3): >>><<< 28023 1726853644.25024: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853644.25177: handler run complete 28023 1726853644.25181: Evaluated conditional (False): False 28023 1726853644.25183: attempt loop complete, returning result 28023 1726853644.25185: _execute() done 28023 1726853644.25187: dumping result to json 28023 1726853644.25189: done dumping result, returning 28023 1726853644.25191: done running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary [02083763-bbaf-fdb6-dad7-0000000008da] 28023 1726853644.25194: sending task result for task 02083763-bbaf-fdb6-dad7-0000000008da 28023 1726853644.25276: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000008da 28023 1726853644.25280: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.008306", "end": "2024-09-20 13:34:04.199913", "rc": 0, "start": "2024-09-20 13:34:04.191607" } 28023 1726853644.25352: no more pending results, returning what we have 28023 1726853644.25356: results queue empty 28023 1726853644.25360: checking for any_errors_fatal 28023 1726853644.25362: done checking for any_errors_fatal 28023 1726853644.25362: checking for max_fail_percentage 28023 1726853644.25365: done checking for max_fail_percentage 28023 1726853644.25366: checking to see if all hosts have failed and the running result is not ok 28023 1726853644.25367: done checking to see if all hosts have failed 28023 1726853644.25368: getting the remaining hosts for 
this loop 28023 1726853644.25392: done getting the remaining hosts for this loop 28023 1726853644.25398: getting the next task for host managed_node3 28023 1726853644.25408: done getting next task for host managed_node3 28023 1726853644.25411: ^ task is: TASK: Assert interface0 is absent 28023 1726853644.25415: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853644.25420: getting variables 28023 1726853644.25422: in VariableManager get_vars() 28023 1726853644.25677: Calling all_inventory to load vars for managed_node3 28023 1726853644.25681: Calling groups_inventory to load vars for managed_node3 28023 1726853644.25684: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853644.25695: Calling all_plugins_play to load vars for managed_node3 28023 1726853644.25698: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853644.25700: Calling groups_plugins_play to load vars for managed_node3 28023 1726853644.27165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853644.28926: done with get_vars() 28023 1726853644.28954: done getting variables TASK [Assert interface0 is absent] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:160 Friday 20 September 2024 13:34:04 -0400 (0:00:00.490) 0:00:36.374 ****** 28023 
1726853644.29063: entering _queue_task() for managed_node3/include_tasks 28023 1726853644.29427: worker is 1 (out of 1 available) 28023 1726853644.29439: exiting _queue_task() for managed_node3/include_tasks 28023 1726853644.29452: done queuing things up, now waiting for results queue to drain 28023 1726853644.29453: waiting for pending results... 28023 1726853644.29887: running TaskExecutor() for managed_node3/TASK: Assert interface0 is absent 28023 1726853644.29898: in run() - task 02083763-bbaf-fdb6-dad7-0000000000b9 28023 1726853644.29901: variable 'ansible_search_path' from source: unknown 28023 1726853644.30005: calling self._execute() 28023 1726853644.30065: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853644.30080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853644.30092: variable 'omit' from source: magic vars 28023 1726853644.30486: variable 'ansible_distribution_major_version' from source: facts 28023 1726853644.30502: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853644.30511: _execute() done 28023 1726853644.30518: dumping result to json 28023 1726853644.30524: done dumping result, returning 28023 1726853644.30532: done running TaskExecutor() for managed_node3/TASK: Assert interface0 is absent [02083763-bbaf-fdb6-dad7-0000000000b9] 28023 1726853644.30544: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b9 28023 1726853644.30735: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000b9 28023 1726853644.30738: WORKER PROCESS EXITING 28023 1726853644.30774: no more pending results, returning what we have 28023 1726853644.30780: in VariableManager get_vars() 28023 1726853644.30836: Calling all_inventory to load vars for managed_node3 28023 1726853644.30839: Calling groups_inventory to load vars for managed_node3 28023 1726853644.30842: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853644.30863: 
Calling all_plugins_play to load vars for managed_node3 28023 1726853644.30867: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853644.30872: Calling groups_plugins_play to load vars for managed_node3 28023 1726853644.32756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853644.34410: done with get_vars() 28023 1726853644.34430: variable 'ansible_search_path' from source: unknown 28023 1726853644.34449: we have included files to process 28023 1726853644.34450: generating all_blocks data 28023 1726853644.34451: done generating all_blocks data 28023 1726853644.34454: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28023 1726853644.34455: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28023 1726853644.34460: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28023 1726853644.34565: in VariableManager get_vars() 28023 1726853644.34589: done with get_vars() 28023 1726853644.34698: done processing included file 28023 1726853644.34699: iterating over new_blocks loaded from include file 28023 1726853644.34701: in VariableManager get_vars() 28023 1726853644.34716: done with get_vars() 28023 1726853644.34718: filtering new block on tags 28023 1726853644.34745: done filtering new block on tags 28023 1726853644.34747: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 28023 1726853644.34752: extending task lists for all hosts with included blocks 28023 1726853644.36333: done extending task lists 28023 1726853644.36334: done processing included files 
28023 1726853644.36335: results queue empty 28023 1726853644.36336: checking for any_errors_fatal 28023 1726853644.36341: done checking for any_errors_fatal 28023 1726853644.36342: checking for max_fail_percentage 28023 1726853644.36343: done checking for max_fail_percentage 28023 1726853644.36344: checking to see if all hosts have failed and the running result is not ok 28023 1726853644.36344: done checking to see if all hosts have failed 28023 1726853644.36345: getting the remaining hosts for this loop 28023 1726853644.36346: done getting the remaining hosts for this loop 28023 1726853644.36349: getting the next task for host managed_node3 28023 1726853644.36353: done getting next task for host managed_node3 28023 1726853644.36356: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28023 1726853644.36361: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853644.36364: getting variables 28023 1726853644.36365: in VariableManager get_vars() 28023 1726853644.36383: Calling all_inventory to load vars for managed_node3 28023 1726853644.36385: Calling groups_inventory to load vars for managed_node3 28023 1726853644.36521: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853644.36527: Calling all_plugins_play to load vars for managed_node3 28023 1726853644.36530: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853644.36533: Calling groups_plugins_play to load vars for managed_node3 28023 1726853644.38785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853644.40616: done with get_vars() 28023 1726853644.40641: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:34:04 -0400 (0:00:00.122) 0:00:36.497 ****** 28023 1726853644.41346: entering _queue_task() for managed_node3/include_tasks 28023 1726853644.42244: worker is 1 (out of 1 available) 28023 1726853644.42261: exiting _queue_task() for managed_node3/include_tasks 28023 1726853644.42278: done queuing things up, now waiting for results queue to drain 28023 1726853644.42279: waiting for pending results... 
28023 1726853644.42891: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 28023 1726853644.42977: in run() - task 02083763-bbaf-fdb6-dad7-000000000990 28023 1726853644.43004: variable 'ansible_search_path' from source: unknown 28023 1726853644.43008: variable 'ansible_search_path' from source: unknown 28023 1726853644.43103: calling self._execute() 28023 1726853644.43365: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853644.43372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853644.43422: variable 'omit' from source: magic vars 28023 1726853644.43968: variable 'ansible_distribution_major_version' from source: facts 28023 1726853644.43973: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853644.44193: _execute() done 28023 1726853644.44197: dumping result to json 28023 1726853644.44199: done dumping result, returning 28023 1726853644.44208: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-fdb6-dad7-000000000990] 28023 1726853644.44275: sending task result for task 02083763-bbaf-fdb6-dad7-000000000990 28023 1726853644.44436: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000990 28023 1726853644.44440: WORKER PROCESS EXITING 28023 1726853644.44482: no more pending results, returning what we have 28023 1726853644.44487: in VariableManager get_vars() 28023 1726853644.44649: Calling all_inventory to load vars for managed_node3 28023 1726853644.44653: Calling groups_inventory to load vars for managed_node3 28023 1726853644.44656: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853644.44670: Calling all_plugins_play to load vars for managed_node3 28023 1726853644.44674: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853644.44677: Calling groups_plugins_play to load vars for managed_node3 28023 
1726853644.47678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853644.50752: done with get_vars() 28023 1726853644.50983: variable 'ansible_search_path' from source: unknown 28023 1726853644.50985: variable 'ansible_search_path' from source: unknown 28023 1726853644.51023: we have included files to process 28023 1726853644.51024: generating all_blocks data 28023 1726853644.51025: done generating all_blocks data 28023 1726853644.51027: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853644.51028: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853644.51030: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28023 1726853644.51419: done processing included file 28023 1726853644.51421: iterating over new_blocks loaded from include file 28023 1726853644.51422: in VariableManager get_vars() 28023 1726853644.51443: done with get_vars() 28023 1726853644.51445: filtering new block on tags 28023 1726853644.51473: done filtering new block on tags 28023 1726853644.51475: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 28023 1726853644.51481: extending task lists for all hosts with included blocks 28023 1726853644.51806: done extending task lists 28023 1726853644.51808: done processing included files 28023 1726853644.51809: results queue empty 28023 1726853644.51809: checking for any_errors_fatal 28023 1726853644.51812: done checking for any_errors_fatal 28023 1726853644.51813: checking for max_fail_percentage 28023 1726853644.51814: done checking for 
max_fail_percentage 28023 1726853644.51815: checking to see if all hosts have failed and the running result is not ok 28023 1726853644.51816: done checking to see if all hosts have failed 28023 1726853644.51816: getting the remaining hosts for this loop 28023 1726853644.51818: done getting the remaining hosts for this loop 28023 1726853644.51820: getting the next task for host managed_node3 28023 1726853644.51824: done getting next task for host managed_node3 28023 1726853644.51827: ^ task is: TASK: Get stat for interface {{ interface }} 28023 1726853644.51830: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853644.51832: getting variables 28023 1726853644.51833: in VariableManager get_vars() 28023 1726853644.51848: Calling all_inventory to load vars for managed_node3 28023 1726853644.51851: Calling groups_inventory to load vars for managed_node3 28023 1726853644.51853: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853644.51861: Calling all_plugins_play to load vars for managed_node3 28023 1726853644.51863: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853644.51866: Calling groups_plugins_play to load vars for managed_node3 28023 1726853644.54304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853644.57782: done with get_vars() 28023 1726853644.57806: done getting variables 28023 1726853644.58179: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:34:04 -0400 (0:00:00.168) 0:00:36.666 ****** 28023 1726853644.58213: entering _queue_task() for managed_node3/stat 28023 1726853644.59021: worker is 1 (out of 1 available) 28023 1726853644.59033: exiting _queue_task() for managed_node3/stat 28023 1726853644.59046: done queuing things up, now waiting for results queue to drain 28023 1726853644.59049: waiting for pending results... 
28023 1726853644.59641: running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 28023 1726853644.59770: in run() - task 02083763-bbaf-fdb6-dad7-000000000a4d 28023 1726853644.60029: variable 'ansible_search_path' from source: unknown 28023 1726853644.60033: variable 'ansible_search_path' from source: unknown 28023 1726853644.60036: calling self._execute() 28023 1726853644.60119: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853644.60123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853644.60136: variable 'omit' from source: magic vars 28023 1726853644.60878: variable 'ansible_distribution_major_version' from source: facts 28023 1726853644.60893: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853644.60896: variable 'omit' from source: magic vars 28023 1726853644.61002: variable 'omit' from source: magic vars 28023 1726853644.61238: variable 'interface' from source: set_fact 28023 1726853644.61256: variable 'omit' from source: magic vars 28023 1726853644.61299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853644.61334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853644.61353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853644.61373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853644.61589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853644.61654: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853644.61661: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853644.61664: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853644.61715: Set connection var ansible_shell_type to sh 28023 1726853644.61722: Set connection var ansible_shell_executable to /bin/sh 28023 1726853644.61727: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853644.61765: Set connection var ansible_connection to ssh 28023 1726853644.61768: Set connection var ansible_pipelining to False 28023 1726853644.61772: Set connection var ansible_timeout to 10 28023 1726853644.61775: variable 'ansible_shell_executable' from source: unknown 28023 1726853644.61777: variable 'ansible_connection' from source: unknown 28023 1726853644.61981: variable 'ansible_module_compression' from source: unknown 28023 1726853644.61985: variable 'ansible_shell_type' from source: unknown 28023 1726853644.61987: variable 'ansible_shell_executable' from source: unknown 28023 1726853644.61990: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853644.62076: variable 'ansible_pipelining' from source: unknown 28023 1726853644.62081: variable 'ansible_timeout' from source: unknown 28023 1726853644.62084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853644.62396: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853644.62405: variable 'omit' from source: magic vars 28023 1726853644.62416: starting attempt loop 28023 1726853644.62419: running the handler 28023 1726853644.62428: _low_level_execute_command(): starting 28023 1726853644.62437: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853644.63587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853644.63595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853644.63789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853644.63890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853644.63929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853644.64105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853644.65843: stdout chunk (state=3): >>>/root <<< 28023 1726853644.65968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853644.65974: stderr chunk (state=3): >>><<< 28023 1726853644.65977: stdout chunk (state=3): >>><<< 28023 1726853644.66007: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853644.66020: _low_level_execute_command(): starting 28023 1726853644.66027: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886 `" && echo ansible-tmp-1726853644.6600661-29657-260529972230886="` echo /root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886 `" ) && sleep 0' 28023 1726853644.67239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853644.67264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853644.67455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853644.67467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853644.67482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853644.67485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853644.67488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853644.67695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853644.69692: stdout chunk (state=3): >>>ansible-tmp-1726853644.6600661-29657-260529972230886=/root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886 <<< 28023 1726853644.69826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853644.69830: stderr chunk (state=3): >>><<< 28023 1726853644.69833: stdout chunk (state=3): >>><<< 28023 1726853644.69859: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853644.6600661-29657-260529972230886=/root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853644.69916: variable 'ansible_module_compression' from source: unknown 28023 1726853644.69974: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28023 1726853644.70201: variable 'ansible_facts' from source: unknown 28023 1726853644.70360: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886/AnsiballZ_stat.py 28023 1726853644.70800: Sending initial data 28023 1726853644.70803: Sent initial data (153 bytes) 28023 1726853644.71486: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853644.71514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853644.71588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853644.73264: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28023 1726853644.73269: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853644.73342: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853644.73415: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmp0f7vgqg4 /root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886/AnsiballZ_stat.py <<< 28023 1726853644.73419: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886/AnsiballZ_stat.py" <<< 28023 1726853644.73495: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmp0f7vgqg4" to remote "/root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886/AnsiballZ_stat.py" <<< 28023 1726853644.74655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853644.74661: stderr chunk (state=3): >>><<< 28023 1726853644.74664: stdout chunk (state=3): >>><<< 28023 1726853644.74743: done transferring module to remote 28023 1726853644.74799: _low_level_execute_command(): starting 28023 1726853644.74803: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886/ /root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886/AnsiballZ_stat.py && sleep 0' 28023 1726853644.75590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853644.75603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853644.75611: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853644.75668: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853644.75755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853644.75763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853644.75777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853644.75819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853644.77806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853644.77810: stderr chunk (state=3): >>><<< 28023 1726853644.77812: stdout chunk (state=3): >>><<< 28023 1726853644.78004: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853644.78008: _low_level_execute_command(): starting 28023 1726853644.78010: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886/AnsiballZ_stat.py && sleep 0' 28023 1726853644.79022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853644.79027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853644.79030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853644.79119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853644.79123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853644.79125: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853644.79128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853644.79156: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853644.79168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853644.79181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853644.79204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853644.79303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853644.95041: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28023 1726853644.96574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853644.96578: stdout chunk (state=3): >>><<< 28023 1726853644.96581: stderr chunk (state=3): >>><<< 28023 1726853644.96648: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853644.96652: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853644.96656: _low_level_execute_command(): starting 28023 1726853644.96662: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853644.6600661-29657-260529972230886/ > /dev/null 2>&1 && sleep 0' 28023 1726853644.97390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853644.97415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853644.97431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853644.97456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853644.97560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853644.99493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853644.99520: stderr chunk (state=3): >>><<< 28023 1726853644.99539: stdout chunk (state=3): >>><<< 28023 1726853644.99776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853644.99779: handler run complete 28023 1726853644.99781: attempt loop complete, returning result 28023 1726853644.99783: _execute() done 28023 1726853644.99785: dumping result to json 28023 1726853644.99787: done dumping result, returning 28023 1726853644.99789: done running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 [02083763-bbaf-fdb6-dad7-000000000a4d] 28023 1726853644.99790: sending task result for task 02083763-bbaf-fdb6-dad7-000000000a4d 28023 1726853644.99857: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000a4d 28023 1726853644.99860: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 28023 1726853644.99924: no more pending results, returning what we have 28023 1726853644.99927: results queue empty 28023 1726853644.99928: checking for any_errors_fatal 28023 1726853644.99929: done checking for any_errors_fatal 28023 1726853644.99930: checking for max_fail_percentage 28023 1726853644.99932: done checking for max_fail_percentage 28023 1726853644.99933: checking to see if all hosts have failed and the running result is not ok 28023 1726853644.99934: done checking to see if all hosts have failed 28023 1726853644.99934: getting the remaining hosts for this loop 28023 1726853644.99936: done getting the remaining hosts for this loop 28023 1726853644.99939: getting the next task for host managed_node3 28023 1726853644.99949: done getting next task for host managed_node3 
28023 1726853644.99951: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 28023 1726853644.99955: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853644.99961: getting variables 28023 1726853644.99962: in VariableManager get_vars() 28023 1726853645.00012: Calling all_inventory to load vars for managed_node3 28023 1726853645.00015: Calling groups_inventory to load vars for managed_node3 28023 1726853645.00017: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853645.00031: Calling all_plugins_play to load vars for managed_node3 28023 1726853645.00033: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853645.00036: Calling groups_plugins_play to load vars for managed_node3 28023 1726853645.01662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853645.03308: done with get_vars() 28023 1726853645.03341: done getting variables 28023 1726853645.03403: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853645.03528: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:34:05 -0400 (0:00:00.453) 0:00:37.119 ****** 28023 1726853645.03567: entering _queue_task() for managed_node3/assert 28023 1726853645.03928: worker is 1 (out of 1 available) 28023 1726853645.03940: exiting _queue_task() for managed_node3/assert 28023 1726853645.03952: done queuing things up, now waiting for results queue to drain 28023 1726853645.03954: waiting for pending results... 28023 1726853645.04303: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'ethtest0' 28023 1726853645.04404: in run() - task 02083763-bbaf-fdb6-dad7-000000000991 28023 1726853645.04433: variable 'ansible_search_path' from source: unknown 28023 1726853645.04441: variable 'ansible_search_path' from source: unknown 28023 1726853645.04641: calling self._execute() 28023 1726853645.04781: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.04795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.04810: variable 'omit' from source: magic vars 28023 1726853645.05223: variable 'ansible_distribution_major_version' from source: facts 28023 1726853645.05244: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853645.05257: variable 'omit' from source: magic vars 28023 1726853645.05325: variable 'omit' from source: magic vars 28023 1726853645.05432: variable 'interface' from source: set_fact 28023 1726853645.05457: variable 'omit' from source: magic vars 28023 1726853645.05515: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853645.05555: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853645.05584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853645.05617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853645.05635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853645.05706: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853645.05710: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.05713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.05798: Set connection var ansible_shell_type to sh 28023 1726853645.05819: Set connection var ansible_shell_executable to /bin/sh 28023 1726853645.05834: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853645.05847: Set connection var ansible_connection to ssh 28023 1726853645.05923: Set connection var ansible_pipelining to False 28023 1726853645.05927: Set connection var ansible_timeout to 10 28023 1726853645.05930: variable 'ansible_shell_executable' from source: unknown 28023 1726853645.05933: variable 'ansible_connection' from source: unknown 28023 1726853645.05935: variable 'ansible_module_compression' from source: unknown 28023 1726853645.05939: variable 'ansible_shell_type' from source: unknown 28023 1726853645.05942: variable 'ansible_shell_executable' from source: unknown 28023 1726853645.05944: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.05946: variable 'ansible_pipelining' from source: unknown 28023 1726853645.05948: variable 'ansible_timeout' 
from source: unknown 28023 1726853645.05950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.06143: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853645.06147: variable 'omit' from source: magic vars 28023 1726853645.06150: starting attempt loop 28023 1726853645.06152: running the handler 28023 1726853645.06360: variable 'interface_stat' from source: set_fact 28023 1726853645.06364: Evaluated conditional (not interface_stat.stat.exists): True 28023 1726853645.06367: handler run complete 28023 1726853645.06370: attempt loop complete, returning result 28023 1726853645.06374: _execute() done 28023 1726853645.06376: dumping result to json 28023 1726853645.06378: done dumping result, returning 28023 1726853645.06380: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'ethtest0' [02083763-bbaf-fdb6-dad7-000000000991] 28023 1726853645.06384: sending task result for task 02083763-bbaf-fdb6-dad7-000000000991 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 28023 1726853645.06627: no more pending results, returning what we have 28023 1726853645.06631: results queue empty 28023 1726853645.06632: checking for any_errors_fatal 28023 1726853645.06642: done checking for any_errors_fatal 28023 1726853645.06643: checking for max_fail_percentage 28023 1726853645.06645: done checking for max_fail_percentage 28023 1726853645.06646: checking to see if all hosts have failed and the running result is not ok 28023 1726853645.06647: done checking to see if all hosts have failed 28023 1726853645.06648: getting the remaining hosts for this loop 28023 1726853645.06650: done getting the remaining hosts for this loop 28023 
1726853645.06654: getting the next task for host managed_node3 28023 1726853645.06663: done getting next task for host managed_node3 28023 1726853645.06666: ^ task is: TASK: Assert interface0 profile and interface1 profile are absent 28023 1726853645.06672: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853645.06678: getting variables 28023 1726853645.06681: in VariableManager get_vars() 28023 1726853645.06726: Calling all_inventory to load vars for managed_node3 28023 1726853645.06729: Calling groups_inventory to load vars for managed_node3 28023 1726853645.06732: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853645.06746: Calling all_plugins_play to load vars for managed_node3 28023 1726853645.06749: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853645.06752: Calling groups_plugins_play to load vars for managed_node3 28023 1726853645.07285: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000991 28023 1726853645.07289: WORKER PROCESS EXITING 28023 1726853645.08729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853645.11035: done with get_vars() 28023 1726853645.11065: done getting variables TASK [Assert interface0 profile and interface1 profile are absent] ************* task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:162 Friday 20 September 2024 13:34:05 -0400 (0:00:00.077) 0:00:37.197 ****** 28023 1726853645.11312: entering _queue_task() for managed_node3/include_tasks 28023 1726853645.11833: worker is 1 (out of 1 available) 28023 1726853645.11860: exiting _queue_task() for managed_node3/include_tasks 28023 1726853645.11876: done queuing things up, now waiting for results queue to drain 28023 1726853645.11877: waiting for pending results... 28023 1726853645.12149: running TaskExecutor() for managed_node3/TASK: Assert interface0 profile and interface1 profile are absent 28023 1726853645.12279: in run() - task 02083763-bbaf-fdb6-dad7-0000000000ba 28023 1726853645.12297: variable 'ansible_search_path' from source: unknown 28023 1726853645.12335: variable 'interface0' from source: play vars 28023 1726853645.12499: variable 'interface0' from source: play vars 28023 1726853645.12512: variable 'interface1' from source: play vars 28023 1726853645.12560: variable 'interface1' from source: play vars 28023 1726853645.12574: variable 'omit' from source: magic vars 28023 1726853645.12686: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.12693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.12702: variable 'omit' from source: magic vars 28023 1726853645.12868: variable 'ansible_distribution_major_version' from source: facts 28023 1726853645.12878: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853645.12900: variable 'item' from source: unknown 28023 1726853645.12949: variable 'item' from source: unknown 28023 1726853645.13104: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.13108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.13111: variable 'omit' from source: magic vars 28023 
1726853645.13169: variable 'ansible_distribution_major_version' from source: facts 28023 1726853645.13174: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853645.13197: variable 'item' from source: unknown 28023 1726853645.13266: variable 'item' from source: unknown 28023 1726853645.13329: dumping result to json 28023 1726853645.13332: done dumping result, returning 28023 1726853645.13335: done running TaskExecutor() for managed_node3/TASK: Assert interface0 profile and interface1 profile are absent [02083763-bbaf-fdb6-dad7-0000000000ba] 28023 1726853645.13337: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000ba 28023 1726853645.13377: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000ba 28023 1726853645.13381: WORKER PROCESS EXITING 28023 1726853645.13418: no more pending results, returning what we have 28023 1726853645.13422: in VariableManager get_vars() 28023 1726853645.13574: Calling all_inventory to load vars for managed_node3 28023 1726853645.13578: Calling groups_inventory to load vars for managed_node3 28023 1726853645.13582: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853645.13599: Calling all_plugins_play to load vars for managed_node3 28023 1726853645.13602: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853645.13605: Calling groups_plugins_play to load vars for managed_node3 28023 1726853645.14901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853645.15763: done with get_vars() 28023 1726853645.15780: variable 'ansible_search_path' from source: unknown 28023 1726853645.15792: variable 'ansible_search_path' from source: unknown 28023 1726853645.15796: we have included files to process 28023 1726853645.15797: generating all_blocks data 28023 1726853645.15798: done generating all_blocks data 28023 1726853645.15800: processing included file: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28023 1726853645.15801: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28023 1726853645.15802: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28023 1726853645.15913: in VariableManager get_vars() 28023 1726853645.15929: done with get_vars() 28023 1726853645.16008: done processing included file 28023 1726853645.16010: iterating over new_blocks loaded from include file 28023 1726853645.16011: in VariableManager get_vars() 28023 1726853645.16023: done with get_vars() 28023 1726853645.16024: filtering new block on tags 28023 1726853645.16043: done filtering new block on tags 28023 1726853645.16045: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node3 => (item=ethtest0) 28023 1726853645.16048: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28023 1726853645.16049: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28023 1726853645.16051: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28023 1726853645.16106: in VariableManager get_vars() 28023 1726853645.16121: done with get_vars() 28023 1726853645.16183: done processing included file 28023 1726853645.16184: iterating over new_blocks loaded from include file 28023 1726853645.16185: in VariableManager get_vars() 28023 1726853645.16196: done with get_vars() 28023 
1726853645.16197: filtering new block on tags 28023 1726853645.16214: done filtering new block on tags 28023 1726853645.16215: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node3 => (item=ethtest1) 28023 1726853645.16218: extending task lists for all hosts with included blocks 28023 1726853645.17690: done extending task lists 28023 1726853645.17691: done processing included files 28023 1726853645.17692: results queue empty 28023 1726853645.17693: checking for any_errors_fatal 28023 1726853645.17697: done checking for any_errors_fatal 28023 1726853645.17698: checking for max_fail_percentage 28023 1726853645.17699: done checking for max_fail_percentage 28023 1726853645.17700: checking to see if all hosts have failed and the running result is not ok 28023 1726853645.17701: done checking to see if all hosts have failed 28023 1726853645.17702: getting the remaining hosts for this loop 28023 1726853645.17703: done getting the remaining hosts for this loop 28023 1726853645.17705: getting the next task for host managed_node3 28023 1726853645.17709: done getting next task for host managed_node3 28023 1726853645.17712: ^ task is: TASK: Include the task 'get_profile_stat.yml' 28023 1726853645.17715: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853645.17718: getting variables 28023 1726853645.17719: in VariableManager get_vars() 28023 1726853645.17734: Calling all_inventory to load vars for managed_node3 28023 1726853645.17737: Calling groups_inventory to load vars for managed_node3 28023 1726853645.17745: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853645.17751: Calling all_plugins_play to load vars for managed_node3 28023 1726853645.17753: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853645.17756: Calling groups_plugins_play to load vars for managed_node3 28023 1726853645.23235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853645.24798: done with get_vars() 28023 1726853645.24824: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:34:05 -0400 (0:00:00.135) 0:00:37.332 ****** 28023 1726853645.24907: entering _queue_task() for managed_node3/include_tasks 28023 1726853645.25281: worker is 1 (out of 1 available) 28023 1726853645.25295: exiting _queue_task() for managed_node3/include_tasks 28023 1726853645.25309: done queuing things up, now waiting for results queue to drain 28023 1726853645.25311: waiting for pending results... 
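The assert task traced above passed because the conditional `not interface_stat.stat.exists` evaluated to True, i.e. a `stat` result registered earlier showed the device file missing. As a minimal sketch of the stat-then-assert pattern this log is exercising (the task names, the `/sys/class/net` path, and the failure message are assumptions for illustration, not the verbatim contents of `assert_device_absent.yml`):

```yaml
# Hedged sketch of a stat + assert pair of the kind traced in this log.
# Path and messages are illustrative assumptions, not the actual task file.
- name: Get stat of the device file for '{{ interface }}'
  stat:
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists
    msg: "Device '{{ interface }}' is unexpectedly present"
```

A registered `stat` result always carries `stat.exists`, which is why the assertion can be evaluated even when the file is absent.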
28023 1726853645.25694: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 28023 1726853645.25700: in run() - task 02083763-bbaf-fdb6-dad7-000000000a6c 28023 1726853645.25703: variable 'ansible_search_path' from source: unknown 28023 1726853645.25706: variable 'ansible_search_path' from source: unknown 28023 1726853645.25719: calling self._execute() 28023 1726853645.25833: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.25931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.25935: variable 'omit' from source: magic vars 28023 1726853645.26249: variable 'ansible_distribution_major_version' from source: facts 28023 1726853645.26262: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853645.26266: _execute() done 28023 1726853645.26270: dumping result to json 28023 1726853645.26275: done dumping result, returning 28023 1726853645.26282: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-fdb6-dad7-000000000a6c] 28023 1726853645.26287: sending task result for task 02083763-bbaf-fdb6-dad7-000000000a6c 28023 1726853645.26381: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000a6c 28023 1726853645.26383: WORKER PROCESS EXITING 28023 1726853645.26408: no more pending results, returning what we have 28023 1726853645.26413: in VariableManager get_vars() 28023 1726853645.26463: Calling all_inventory to load vars for managed_node3 28023 1726853645.26466: Calling groups_inventory to load vars for managed_node3 28023 1726853645.26468: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853645.26484: Calling all_plugins_play to load vars for managed_node3 28023 1726853645.26487: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853645.26489: Calling groups_plugins_play to load vars for managed_node3 28023 
1726853645.27917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853645.29736: done with get_vars() 28023 1726853645.29756: variable 'ansible_search_path' from source: unknown 28023 1726853645.29757: variable 'ansible_search_path' from source: unknown 28023 1726853645.29797: we have included files to process 28023 1726853645.29805: generating all_blocks data 28023 1726853645.29806: done generating all_blocks data 28023 1726853645.29808: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28023 1726853645.29809: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28023 1726853645.29811: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28023 1726853645.30932: done processing included file 28023 1726853645.30935: iterating over new_blocks loaded from include file 28023 1726853645.30937: in VariableManager get_vars() 28023 1726853645.30959: done with get_vars() 28023 1726853645.30961: filtering new block on tags 28023 1726853645.31095: done filtering new block on tags 28023 1726853645.31100: in VariableManager get_vars() 28023 1726853645.31125: done with get_vars() 28023 1726853645.31126: filtering new block on tags 28023 1726853645.31182: done filtering new block on tags 28023 1726853645.31185: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 28023 1726853645.31190: extending task lists for all hosts with included blocks 28023 1726853645.31331: done extending task lists 28023 1726853645.31333: done processing included files 28023 1726853645.31334: results queue empty 28023 
1726853645.31335: checking for any_errors_fatal 28023 1726853645.31339: done checking for any_errors_fatal 28023 1726853645.31340: checking for max_fail_percentage 28023 1726853645.31341: done checking for max_fail_percentage 28023 1726853645.31342: checking to see if all hosts have failed and the running result is not ok 28023 1726853645.31343: done checking to see if all hosts have failed 28023 1726853645.31343: getting the remaining hosts for this loop 28023 1726853645.31344: done getting the remaining hosts for this loop 28023 1726853645.31347: getting the next task for host managed_node3 28023 1726853645.31351: done getting next task for host managed_node3 28023 1726853645.31354: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 28023 1726853645.31357: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853645.31359: getting variables 28023 1726853645.31360: in VariableManager get_vars() 28023 1726853645.31375: Calling all_inventory to load vars for managed_node3 28023 1726853645.31378: Calling groups_inventory to load vars for managed_node3 28023 1726853645.31380: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853645.31385: Calling all_plugins_play to load vars for managed_node3 28023 1726853645.31388: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853645.31390: Calling groups_plugins_play to load vars for managed_node3 28023 1726853645.32588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853645.34250: done with get_vars() 28023 1726853645.34286: done getting variables 28023 1726853645.34337: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:34:05 -0400 (0:00:00.094) 0:00:37.427 ****** 28023 1726853645.34378: entering _queue_task() for managed_node3/set_fact 28023 1726853645.34905: worker is 1 (out of 1 available) 28023 1726853645.34916: exiting _queue_task() for managed_node3/set_fact 28023 1726853645.34926: done queuing things up, now waiting for results queue to drain 28023 1726853645.34927: waiting for pending results... 
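The `set_fact` task queued here initializes three flags that the task result further down reports as `false` (`lsr_net_profile_exists`, `lsr_net_profile_ansible_managed`, `lsr_net_profile_fingerprint`). Based on those fact names in the logged result, the task in `get_profile_stat.yml` is roughly the following (a sketch inferred from the output, not the verbatim file):

```yaml
# Sketch inferred from the ansible_facts shown in the result below.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Initializing the flags to `false` up front lets later tasks in the include set them to `true` only when the profile file is found and parsed.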
28023 1726853645.35290: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 28023 1726853645.35295: in run() - task 02083763-bbaf-fdb6-dad7-000000000b3c 28023 1726853645.35298: variable 'ansible_search_path' from source: unknown 28023 1726853645.35301: variable 'ansible_search_path' from source: unknown 28023 1726853645.35303: calling self._execute() 28023 1726853645.35398: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.35402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.35411: variable 'omit' from source: magic vars 28023 1726853645.35795: variable 'ansible_distribution_major_version' from source: facts 28023 1726853645.35818: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853645.35824: variable 'omit' from source: magic vars 28023 1726853645.35881: variable 'omit' from source: magic vars 28023 1726853645.35925: variable 'omit' from source: magic vars 28023 1726853645.35966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853645.36008: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853645.36037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853645.36176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853645.36180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853645.36183: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853645.36185: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.36187: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 28023 1726853645.36203: Set connection var ansible_shell_type to sh 28023 1726853645.36211: Set connection var ansible_shell_executable to /bin/sh 28023 1726853645.36217: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853645.36223: Set connection var ansible_connection to ssh 28023 1726853645.36228: Set connection var ansible_pipelining to False 28023 1726853645.36233: Set connection var ansible_timeout to 10 28023 1726853645.36272: variable 'ansible_shell_executable' from source: unknown 28023 1726853645.36276: variable 'ansible_connection' from source: unknown 28023 1726853645.36280: variable 'ansible_module_compression' from source: unknown 28023 1726853645.36282: variable 'ansible_shell_type' from source: unknown 28023 1726853645.36285: variable 'ansible_shell_executable' from source: unknown 28023 1726853645.36287: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.36289: variable 'ansible_pipelining' from source: unknown 28023 1726853645.36291: variable 'ansible_timeout' from source: unknown 28023 1726853645.36296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.36439: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853645.36449: variable 'omit' from source: magic vars 28023 1726853645.36465: starting attempt loop 28023 1726853645.36468: running the handler 28023 1726853645.36482: handler run complete 28023 1726853645.36493: attempt loop complete, returning result 28023 1726853645.36496: _execute() done 28023 1726853645.36499: dumping result to json 28023 1726853645.36501: done dumping result, returning 28023 1726853645.36510: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-fdb6-dad7-000000000b3c] 28023 1726853645.36676: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b3c 28023 1726853645.36746: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b3c 28023 1726853645.36750: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 28023 1726853645.36809: no more pending results, returning what we have 28023 1726853645.36813: results queue empty 28023 1726853645.36813: checking for any_errors_fatal 28023 1726853645.36815: done checking for any_errors_fatal 28023 1726853645.36816: checking for max_fail_percentage 28023 1726853645.36817: done checking for max_fail_percentage 28023 1726853645.36818: checking to see if all hosts have failed and the running result is not ok 28023 1726853645.36819: done checking to see if all hosts have failed 28023 1726853645.36820: getting the remaining hosts for this loop 28023 1726853645.36822: done getting the remaining hosts for this loop 28023 1726853645.36826: getting the next task for host managed_node3 28023 1726853645.36833: done getting next task for host managed_node3 28023 1726853645.36836: ^ task is: TASK: Stat profile file 28023 1726853645.36841: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853645.36844: getting variables 28023 1726853645.36846: in VariableManager get_vars() 28023 1726853645.37004: Calling all_inventory to load vars for managed_node3 28023 1726853645.37008: Calling groups_inventory to load vars for managed_node3 28023 1726853645.37011: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853645.37020: Calling all_plugins_play to load vars for managed_node3 28023 1726853645.37023: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853645.37026: Calling groups_plugins_play to load vars for managed_node3 28023 1726853645.38587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853645.40267: done with get_vars() 28023 1726853645.40291: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:34:05 -0400 (0:00:00.060) 0:00:37.487 ****** 28023 1726853645.40392: entering _queue_task() for managed_node3/stat 28023 1726853645.40746: worker is 1 (out of 1 available) 28023 1726853645.40757: exiting _queue_task() for 
managed_node3/stat 28023 1726853645.40769: done queuing things up, now waiting for results queue to drain 28023 1726853645.40878: waiting for pending results... 28023 1726853645.41079: running TaskExecutor() for managed_node3/TASK: Stat profile file 28023 1726853645.41196: in run() - task 02083763-bbaf-fdb6-dad7-000000000b3d 28023 1726853645.41221: variable 'ansible_search_path' from source: unknown 28023 1726853645.41225: variable 'ansible_search_path' from source: unknown 28023 1726853645.41264: calling self._execute() 28023 1726853645.41376: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.41385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.41394: variable 'omit' from source: magic vars 28023 1726853645.41786: variable 'ansible_distribution_major_version' from source: facts 28023 1726853645.41798: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853645.41975: variable 'omit' from source: magic vars 28023 1726853645.41979: variable 'omit' from source: magic vars 28023 1726853645.41981: variable 'profile' from source: include params 28023 1726853645.41984: variable 'item' from source: include params 28023 1726853645.42034: variable 'item' from source: include params 28023 1726853645.42054: variable 'omit' from source: magic vars 28023 1726853645.42104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853645.42139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853645.42162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853645.42178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853645.42199: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853645.42228: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853645.42232: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.42235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.42340: Set connection var ansible_shell_type to sh 28023 1726853645.42349: Set connection var ansible_shell_executable to /bin/sh 28023 1726853645.42355: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853645.42362: Set connection var ansible_connection to ssh 28023 1726853645.42367: Set connection var ansible_pipelining to False 28023 1726853645.42373: Set connection var ansible_timeout to 10 28023 1726853645.42399: variable 'ansible_shell_executable' from source: unknown 28023 1726853645.42411: variable 'ansible_connection' from source: unknown 28023 1726853645.42414: variable 'ansible_module_compression' from source: unknown 28023 1726853645.42416: variable 'ansible_shell_type' from source: unknown 28023 1726853645.42418: variable 'ansible_shell_executable' from source: unknown 28023 1726853645.42421: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.42425: variable 'ansible_pipelining' from source: unknown 28023 1726853645.42428: variable 'ansible_timeout' from source: unknown 28023 1726853645.42432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.42633: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853645.42644: variable 'omit' from source: magic vars 28023 1726853645.42650: starting attempt loop 28023 1726853645.42654: running 
the handler 28023 1726853645.42667: _low_level_execute_command(): starting 28023 1726853645.42675: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853645.43476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853645.43480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853645.43482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853645.43485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853645.43488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853645.43490: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853645.43492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853645.43494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853645.43496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853645.43501: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28023 1726853645.43535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853645.43599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853645.43624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853645.43729: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853645.45432: stdout chunk (state=3): >>>/root <<< 28023 1726853645.45587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853645.45591: stdout chunk (state=3): >>><<< 28023 1726853645.45593: stderr chunk (state=3): >>><<< 28023 1726853645.45721: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853645.45726: _low_level_execute_command(): starting 28023 1726853645.45730: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451 `" && echo ansible-tmp-1726853645.4562569-29691-115167955765451="` echo 
/root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451 `" ) && sleep 0' 28023 1726853645.46318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853645.46322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28023 1726853645.46386: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 28023 1726853645.46390: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853645.46447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853645.46463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853645.46488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853645.46608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853645.48569: stdout chunk (state=3): >>>ansible-tmp-1726853645.4562569-29691-115167955765451=/root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451 <<< 28023 1726853645.48738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853645.48742: stdout chunk (state=3): >>><<< 28023 
1726853645.48744: stderr chunk (state=3): >>><<< 28023 1726853645.48767: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853645.4562569-29691-115167955765451=/root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853645.48823: variable 'ansible_module_compression' from source: unknown 28023 1726853645.49076: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28023 1726853645.49079: variable 'ansible_facts' from source: unknown 28023 1726853645.49082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451/AnsiballZ_stat.py 28023 1726853645.49219: Sending initial data 28023 1726853645.49295: Sent initial data (153 bytes) 28023 1726853645.49894: 
stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853645.49940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853645.50033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853645.51685: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" <<< 28023 1726853645.51690: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853645.51744: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853645.51801: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpufbqwlh5 /root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451/AnsiballZ_stat.py <<< 28023 1726853645.51804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451/AnsiballZ_stat.py" <<< 28023 1726853645.51860: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 28023 1726853645.51866: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpufbqwlh5" to remote "/root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451/AnsiballZ_stat.py" <<< 28023 1726853645.52587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853645.52735: stderr chunk (state=3): >>><<< 28023 1726853645.52738: stdout chunk (state=3): >>><<< 28023 1726853645.52740: done transferring module to remote 28023 1726853645.52742: _low_level_execute_command(): starting 28023 1726853645.52745: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451/ /root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451/AnsiballZ_stat.py && sleep 0' 28023 1726853645.53275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853645.53300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853645.53391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853645.55248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853645.55277: stderr chunk (state=3): >>><<< 28023 1726853645.55280: stdout chunk (state=3): >>><<< 28023 1726853645.55290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853645.55293: _low_level_execute_command(): starting 28023 1726853645.55298: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451/AnsiballZ_stat.py && sleep 0' 28023 1726853645.55724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853645.55728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853645.55730: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853645.55732: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853645.55734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853645.55776: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853645.55791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853645.55862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853645.71565: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28023 1726853645.73101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853645.73105: stdout chunk (state=3): >>><<< 28023 1726853645.73108: stderr chunk (state=3): >>><<< 28023 1726853645.73137: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853645.73267: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853645.73274: _low_level_execute_command(): starting 28023 1726853645.73277: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853645.4562569-29691-115167955765451/ > /dev/null 2>&1 && sleep 0' 28023 1726853645.73913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853645.73927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853645.73963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853645.74070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853645.74116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853645.74187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853645.76092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853645.76276: stderr chunk (state=3): >>><<< 28023 1726853645.76279: stdout chunk (state=3): >>><<< 28023 1726853645.76281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853645.76283: handler run complete 28023 1726853645.76285: attempt loop complete, returning result 28023 1726853645.76286: _execute() done 28023 1726853645.76288: dumping result to json 28023 1726853645.76289: done dumping result, returning 28023 1726853645.76291: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-fdb6-dad7-000000000b3d] 28023 1726853645.76293: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b3d 28023 1726853645.76360: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b3d 28023 1726853645.76363: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 28023 1726853645.76431: no more pending results, returning what we have 28023 1726853645.76434: results queue empty 28023 1726853645.76435: checking for any_errors_fatal 28023 1726853645.76442: done checking for any_errors_fatal 28023 1726853645.76443: checking for max_fail_percentage 28023 1726853645.76445: done checking for max_fail_percentage 28023 1726853645.76446: checking to see if all hosts have failed and the running result is not ok 28023 1726853645.76447: done checking to see if all hosts have failed 28023 1726853645.76447: getting the remaining hosts for this loop 28023 1726853645.76450: done getting the remaining hosts for this loop 28023 1726853645.76453: getting the next task for host managed_node3 28023 1726853645.76462: done getting next task for host managed_node3 28023 1726853645.76465: ^ task is: TASK: Set NM profile exist flag based on 
the profile files 28023 1726853645.76470: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853645.76476: getting variables 28023 1726853645.76477: in VariableManager get_vars() 28023 1726853645.76520: Calling all_inventory to load vars for managed_node3 28023 1726853645.76523: Calling groups_inventory to load vars for managed_node3 28023 1726853645.76525: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853645.76538: Calling all_plugins_play to load vars for managed_node3 28023 1726853645.76542: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853645.76544: Calling groups_plugins_play to load vars for managed_node3 28023 1726853645.78165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853645.79741: done with get_vars() 28023 1726853645.79766: done getting variables 28023 1726853645.79828: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:34:05 -0400 (0:00:00.394) 0:00:37.882 ****** 28023 1726853645.79869: entering _queue_task() for managed_node3/set_fact 28023 1726853645.80216: worker is 1 (out of 1 available) 28023 1726853645.80227: exiting _queue_task() for managed_node3/set_fact 28023 1726853645.80240: done queuing things up, now waiting for results queue to drain 28023 1726853645.80241: waiting for pending results... 
28023 1726853645.80540: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 28023 1726853645.80668: in run() - task 02083763-bbaf-fdb6-dad7-000000000b3e 28023 1726853645.80682: variable 'ansible_search_path' from source: unknown 28023 1726853645.80686: variable 'ansible_search_path' from source: unknown 28023 1726853645.80877: calling self._execute() 28023 1726853645.80881: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.80883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.80887: variable 'omit' from source: magic vars 28023 1726853645.81248: variable 'ansible_distribution_major_version' from source: facts 28023 1726853645.81262: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853645.81388: variable 'profile_stat' from source: set_fact 28023 1726853645.81398: Evaluated conditional (profile_stat.stat.exists): False 28023 1726853645.81402: when evaluation is False, skipping this task 28023 1726853645.81405: _execute() done 28023 1726853645.81408: dumping result to json 28023 1726853645.81410: done dumping result, returning 28023 1726853645.81418: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-fdb6-dad7-000000000b3e] 28023 1726853645.81423: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b3e 28023 1726853645.81518: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b3e 28023 1726853645.81522: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28023 1726853645.81577: no more pending results, returning what we have 28023 1726853645.81581: results queue empty 28023 1726853645.81582: checking for any_errors_fatal 28023 1726853645.81592: done checking for any_errors_fatal 28023 1726853645.81593: 
checking for max_fail_percentage 28023 1726853645.81595: done checking for max_fail_percentage 28023 1726853645.81596: checking to see if all hosts have failed and the running result is not ok 28023 1726853645.81597: done checking to see if all hosts have failed 28023 1726853645.81598: getting the remaining hosts for this loop 28023 1726853645.81599: done getting the remaining hosts for this loop 28023 1726853645.81603: getting the next task for host managed_node3 28023 1726853645.81612: done getting next task for host managed_node3 28023 1726853645.81615: ^ task is: TASK: Get NM profile info 28023 1726853645.81621: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False
28023 1726853645.81625: getting variables
28023 1726853645.81627: in VariableManager get_vars()
28023 1726853645.81782: Calling all_inventory to load vars for managed_node3
28023 1726853645.81785: Calling groups_inventory to load vars for managed_node3
28023 1726853645.81788: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853645.81801: Calling all_plugins_play to load vars for managed_node3
28023 1726853645.81804: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853645.81807: Calling groups_plugins_play to load vars for managed_node3
28023 1726853645.83450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853645.85584: done with get_vars()
28023 1726853645.85614: done getting variables
28023 1726853645.85925: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Get NM profile info] *****************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Friday 20 September 2024 13:34:05 -0400 (0:00:00.060) 0:00:37.943 ******
28023 1726853645.85963: entering _queue_task() for managed_node3/shell
28023 1726853645.85965: Creating lock for shell
28023 1726853645.86491: worker is 1 (out of 1 available)
28023 1726853645.86503: exiting _queue_task() for managed_node3/shell
28023 1726853645.86514: done queuing things up, now waiting for results queue to drain
28023 1726853645.86515: waiting for pending results...
28023 1726853645.86891: running TaskExecutor() for managed_node3/TASK: Get NM profile info 28023 1726853645.86901: in run() - task 02083763-bbaf-fdb6-dad7-000000000b3f 28023 1726853645.86905: variable 'ansible_search_path' from source: unknown 28023 1726853645.86908: variable 'ansible_search_path' from source: unknown 28023 1726853645.86943: calling self._execute() 28023 1726853645.87274: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.87278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.87282: variable 'omit' from source: magic vars 28023 1726853645.87467: variable 'ansible_distribution_major_version' from source: facts 28023 1726853645.87480: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853645.87486: variable 'omit' from source: magic vars 28023 1726853645.87540: variable 'omit' from source: magic vars 28023 1726853645.87660: variable 'profile' from source: include params 28023 1726853645.87668: variable 'item' from source: include params 28023 1726853645.87737: variable 'item' from source: include params 28023 1726853645.87757: variable 'omit' from source: magic vars 28023 1726853645.87810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853645.87846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853645.87878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853645.87896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853645.87907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853645.87936: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 
1726853645.87939: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.87942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.88048: Set connection var ansible_shell_type to sh 28023 1726853645.88081: Set connection var ansible_shell_executable to /bin/sh 28023 1726853645.88086: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853645.88088: Set connection var ansible_connection to ssh 28023 1726853645.88181: Set connection var ansible_pipelining to False 28023 1726853645.88189: Set connection var ansible_timeout to 10 28023 1726853645.88222: variable 'ansible_shell_executable' from source: unknown 28023 1726853645.88226: variable 'ansible_connection' from source: unknown 28023 1726853645.88228: variable 'ansible_module_compression' from source: unknown 28023 1726853645.88231: variable 'ansible_shell_type' from source: unknown 28023 1726853645.88234: variable 'ansible_shell_executable' from source: unknown 28023 1726853645.88237: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853645.88239: variable 'ansible_pipelining' from source: unknown 28023 1726853645.88242: variable 'ansible_timeout' from source: unknown 28023 1726853645.88246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853645.88574: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853645.88649: variable 'omit' from source: magic vars 28023 1726853645.88652: starting attempt loop 28023 1726853645.88655: running the handler 28023 1726853645.88669: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853645.88691: _low_level_execute_command(): starting 28023 1726853645.88698: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853645.90124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853645.90131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853645.90346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853645.90576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853645.92153: stdout chunk (state=3): >>>/root <<< 28023 1726853645.92284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853645.92288: stderr chunk (state=3): >>><<< 28023 1726853645.92293: stdout chunk (state=3): >>><<< 28023 1726853645.92397: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853645.92411: _low_level_execute_command(): starting 28023 1726853645.92419: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910 `" && echo ansible-tmp-1726853645.923972-29710-89413888527910="` echo /root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910 `" ) && sleep 0' 28023 1726853645.93735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853645.93877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853645.93890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28023 1726853645.93894: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853645.93897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853645.93975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853645.93992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853645.94009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853645.94187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853645.96184: stdout chunk (state=3): >>>ansible-tmp-1726853645.923972-29710-89413888527910=/root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910 <<< 28023 1726853645.96376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853645.96387: stdout chunk (state=3): >>><<< 28023 1726853645.96415: stderr chunk (state=3): >>><<< 28023 1726853645.96583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853645.923972-29710-89413888527910=/root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853645.96586: variable 'ansible_module_compression' from source: unknown 28023 1726853645.96588: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853645.96605: variable 'ansible_facts' from source: unknown 28023 1726853645.96699: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910/AnsiballZ_command.py 28023 1726853645.96926: Sending initial data 28023 1726853645.96935: Sent initial data (154 bytes) 28023 1726853645.97852: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853645.97888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853645.97966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853645.98014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853645.98074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853645.99780: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853645.99917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853646.00045: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpttkmdoec /root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910/AnsiballZ_command.py <<< 28023 1726853646.00049: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910/AnsiballZ_command.py" <<< 28023 1726853646.00145: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpttkmdoec" to remote "/root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910/AnsiballZ_command.py" <<< 28023 1726853646.01485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853646.01489: stdout chunk (state=3): >>><<< 28023 1726853646.01492: stderr chunk (state=3): >>><<< 28023 1726853646.01674: done transferring module to remote 28023 1726853646.01678: _low_level_execute_command(): starting 28023 1726853646.01685: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910/ /root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910/AnsiballZ_command.py && sleep 0' 28023 1726853646.02359: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.02400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853646.02422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853646.02436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853646.02531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853646.04483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853646.04699: stdout chunk (state=3): >>><<< 28023 1726853646.04708: stderr chunk (state=3): >>><<< 28023 1726853646.04711: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853646.04714: _low_level_execute_command(): starting 28023 1726853646.04717: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910/AnsiballZ_command.py && sleep 0' 28023 1726853646.05395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853646.05411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853646.05424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853646.05498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.05542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853646.05565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853646.05593: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 28023 1726853646.05702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853646.23099: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 13:34:06.212268", "end": "2024-09-20 13:34:06.229649", "delta": "0:00:00.017381", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853646.24695: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.217 closed. <<< 28023 1726853646.24723: stderr chunk (state=3): >>><<< 28023 1726853646.24726: stdout chunk (state=3): >>><<< 28023 1726853646.24742: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 13:34:06.212268", "end": "2024-09-20 13:34:06.229649", "delta": "0:00:00.017381", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.217 closed. 28023 1726853646.24773: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853646.24781: _low_level_execute_command(): starting 28023 1726853646.24786: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726853645.923972-29710-89413888527910/ > /dev/null 2>&1 && sleep 0' 28023 1726853646.25237: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853646.25240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.25247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853646.25249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853646.25251: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.25297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853646.25300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853646.25304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853646.25362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853646.27219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853646.27240: stderr chunk (state=3): >>><<< 28023 1726853646.27243: stdout chunk (state=3): >>><<< 28023 1726853646.27256: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853646.27265: handler run complete 28023 1726853646.27288: Evaluated conditional (False): False 28023 1726853646.27296: attempt loop complete, returning result 28023 1726853646.27299: _execute() done 28023 1726853646.27302: dumping result to json 28023 1726853646.27307: done dumping result, returning 28023 1726853646.27314: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-fdb6-dad7-000000000b3f] 28023 1726853646.27318: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b3f 28023 1726853646.27412: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b3f 28023 1726853646.27415: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! 
=> {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc",
    "delta": "0:00:00.017381",
    "end": "2024-09-20 13:34:06.229649",
    "rc": 1,
    "start": "2024-09-20 13:34:06.212268"
}

MSG:

non-zero return code
...ignoring
28023 1726853646.27486: no more pending results, returning what we have
28023 1726853646.27489: results queue empty
28023 1726853646.27490: checking for any_errors_fatal
28023 1726853646.27496: done checking for any_errors_fatal
28023 1726853646.27496: checking for max_fail_percentage
28023 1726853646.27498: done checking for max_fail_percentage
28023 1726853646.27499: checking to see if all hosts have failed and the running result is not ok
28023 1726853646.27500: done checking to see if all hosts have failed
28023 1726853646.27500: getting the remaining hosts for this loop
28023 1726853646.27502: done getting the remaining hosts for this loop
28023 1726853646.27505: getting the next task for host managed_node3
28023 1726853646.27512: done getting next task for host managed_node3
28023 1726853646.27515: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
28023 1726853646.27520: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
28023 1726853646.27526: getting variables
28023 1726853646.27527: in VariableManager get_vars()
28023 1726853646.27568: Calling all_inventory to load vars for managed_node3
28023 1726853646.27577: Calling groups_inventory to load vars for managed_node3
28023 1726853646.27580: Calling all_plugins_inventory to load vars for managed_node3
28023 1726853646.27590: Calling all_plugins_play to load vars for managed_node3
28023 1726853646.27593: Calling groups_plugins_inventory to load vars for managed_node3
28023 1726853646.27595: Calling groups_plugins_play to load vars for managed_node3
28023 1726853646.28418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28023 1726853646.30329: done with get_vars()
28023 1726853646.30364: done getting variables
28023 1726853646.30431: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Friday 20 September 2024 13:34:06 -0400 (0:00:00.445) 0:00:38.388 ******
28023 1726853646.30477: entering _queue_task() for managed_node3/set_fact
28023 1726853646.30831: worker is 1 (out of 1 available)
28023 1726853646.30844: exiting _queue_task() for
managed_node3/set_fact 28023 1726853646.30856: done queuing things up, now waiting for results queue to drain 28023 1726853646.30860: waiting for pending results... 28023 1726853646.31395: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28023 1726853646.31401: in run() - task 02083763-bbaf-fdb6-dad7-000000000b40 28023 1726853646.31406: variable 'ansible_search_path' from source: unknown 28023 1726853646.31409: variable 'ansible_search_path' from source: unknown 28023 1726853646.31416: calling self._execute() 28023 1726853646.31486: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.31500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.31510: variable 'omit' from source: magic vars 28023 1726853646.31923: variable 'ansible_distribution_major_version' from source: facts 28023 1726853646.31941: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853646.32092: variable 'nm_profile_exists' from source: set_fact 28023 1726853646.32105: Evaluated conditional (nm_profile_exists.rc == 0): False 28023 1726853646.32109: when evaluation is False, skipping this task 28023 1726853646.32112: _execute() done 28023 1726853646.32115: dumping result to json 28023 1726853646.32118: done dumping result, returning 28023 1726853646.32125: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-fdb6-dad7-000000000b40] 28023 1726853646.32130: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b40 28023 1726853646.32235: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b40 28023 1726853646.32239: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 28023 
1726853646.32301: no more pending results, returning what we have 28023 1726853646.32305: results queue empty 28023 1726853646.32306: checking for any_errors_fatal 28023 1726853646.32317: done checking for any_errors_fatal 28023 1726853646.32318: checking for max_fail_percentage 28023 1726853646.32319: done checking for max_fail_percentage 28023 1726853646.32320: checking to see if all hosts have failed and the running result is not ok 28023 1726853646.32321: done checking to see if all hosts have failed 28023 1726853646.32322: getting the remaining hosts for this loop 28023 1726853646.32324: done getting the remaining hosts for this loop 28023 1726853646.32327: getting the next task for host managed_node3 28023 1726853646.32338: done getting next task for host managed_node3 28023 1726853646.32340: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 28023 1726853646.32346: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 28023 1726853646.32350: getting variables 28023 1726853646.32352: in VariableManager get_vars() 28023 1726853646.32516: Calling all_inventory to load vars for managed_node3 28023 1726853646.32519: Calling groups_inventory to load vars for managed_node3 28023 1726853646.32522: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853646.32536: Calling all_plugins_play to load vars for managed_node3 28023 1726853646.32539: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853646.32542: Calling groups_plugins_play to load vars for managed_node3 28023 1726853646.34165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853646.35892: done with get_vars() 28023 1726853646.35924: done getting variables 28023 1726853646.36000: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853646.36130: variable 'profile' from source: include params 28023 1726853646.36135: variable 'item' from source: include params 28023 1726853646.36209: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:34:06 -0400 (0:00:00.057) 0:00:38.446 ****** 28023 1726853646.36244: entering _queue_task() for managed_node3/command 28023 1726853646.36639: worker is 1 (out of 1 available) 28023 1726853646.36651: exiting _queue_task() for managed_node3/command 28023 1726853646.36666: done queuing things up, now waiting for results queue to drain 28023 1726853646.36667: 
waiting for pending results... 28023 1726853646.37190: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 28023 1726853646.37196: in run() - task 02083763-bbaf-fdb6-dad7-000000000b42 28023 1726853646.37201: variable 'ansible_search_path' from source: unknown 28023 1726853646.37205: variable 'ansible_search_path' from source: unknown 28023 1726853646.37208: calling self._execute() 28023 1726853646.37306: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.37309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.37321: variable 'omit' from source: magic vars 28023 1726853646.37720: variable 'ansible_distribution_major_version' from source: facts 28023 1726853646.37732: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853646.37867: variable 'profile_stat' from source: set_fact 28023 1726853646.37880: Evaluated conditional (profile_stat.stat.exists): False 28023 1726853646.37884: when evaluation is False, skipping this task 28023 1726853646.37887: _execute() done 28023 1726853646.37889: dumping result to json 28023 1726853646.37892: done dumping result, returning 28023 1726853646.37900: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [02083763-bbaf-fdb6-dad7-000000000b42] 28023 1726853646.37905: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b42 28023 1726853646.38003: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b42 28023 1726853646.38005: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28023 1726853646.38074: no more pending results, returning what we have 28023 1726853646.38079: results queue empty 28023 1726853646.38080: checking for any_errors_fatal 28023 1726853646.38091: done checking for 
any_errors_fatal 28023 1726853646.38092: checking for max_fail_percentage 28023 1726853646.38093: done checking for max_fail_percentage 28023 1726853646.38094: checking to see if all hosts have failed and the running result is not ok 28023 1726853646.38096: done checking to see if all hosts have failed 28023 1726853646.38096: getting the remaining hosts for this loop 28023 1726853646.38098: done getting the remaining hosts for this loop 28023 1726853646.38102: getting the next task for host managed_node3 28023 1726853646.38112: done getting next task for host managed_node3 28023 1726853646.38114: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 28023 1726853646.38119: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853646.38126: getting variables 28023 1726853646.38128: in VariableManager get_vars() 28023 1726853646.38392: Calling all_inventory to load vars for managed_node3 28023 1726853646.38395: Calling groups_inventory to load vars for managed_node3 28023 1726853646.38398: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853646.38409: Calling all_plugins_play to load vars for managed_node3 28023 1726853646.38412: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853646.38415: Calling groups_plugins_play to load vars for managed_node3 28023 1726853646.40098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853646.41799: done with get_vars() 28023 1726853646.41828: done getting variables 28023 1726853646.41902: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853646.42021: variable 'profile' from source: include params 28023 1726853646.42026: variable 'item' from source: include params 28023 1726853646.42093: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:34:06 -0400 (0:00:00.058) 0:00:38.505 ****** 28023 1726853646.42128: entering _queue_task() for managed_node3/set_fact 28023 1726853646.42509: worker is 1 (out of 1 available) 28023 1726853646.42637: exiting _queue_task() for managed_node3/set_fact 28023 1726853646.42650: done queuing things up, now waiting for results queue to drain 28023 1726853646.42651: waiting for pending results... 
28023 1726853646.42870: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 28023 1726853646.43079: in run() - task 02083763-bbaf-fdb6-dad7-000000000b43 28023 1726853646.43083: variable 'ansible_search_path' from source: unknown 28023 1726853646.43086: variable 'ansible_search_path' from source: unknown 28023 1726853646.43089: calling self._execute() 28023 1726853646.43214: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.43217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.43227: variable 'omit' from source: magic vars 28023 1726853646.43632: variable 'ansible_distribution_major_version' from source: facts 28023 1726853646.43649: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853646.43782: variable 'profile_stat' from source: set_fact 28023 1726853646.43975: Evaluated conditional (profile_stat.stat.exists): False 28023 1726853646.43978: when evaluation is False, skipping this task 28023 1726853646.43980: _execute() done 28023 1726853646.43982: dumping result to json 28023 1726853646.43983: done dumping result, returning 28023 1726853646.43985: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [02083763-bbaf-fdb6-dad7-000000000b43] 28023 1726853646.43987: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b43 28023 1726853646.44053: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b43 28023 1726853646.44056: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28023 1726853646.44108: no more pending results, returning what we have 28023 1726853646.44111: results queue empty 28023 1726853646.44112: checking for any_errors_fatal 28023 1726853646.44120: done checking for any_errors_fatal 28023 1726853646.44120: 
checking for max_fail_percentage 28023 1726853646.44122: done checking for max_fail_percentage 28023 1726853646.44123: checking to see if all hosts have failed and the running result is not ok 28023 1726853646.44124: done checking to see if all hosts have failed 28023 1726853646.44125: getting the remaining hosts for this loop 28023 1726853646.44126: done getting the remaining hosts for this loop 28023 1726853646.44130: getting the next task for host managed_node3 28023 1726853646.44138: done getting next task for host managed_node3 28023 1726853646.44141: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 28023 1726853646.44147: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853646.44151: getting variables 28023 1726853646.44153: in VariableManager get_vars() 28023 1726853646.44205: Calling all_inventory to load vars for managed_node3 28023 1726853646.44208: Calling groups_inventory to load vars for managed_node3 28023 1726853646.44211: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853646.44224: Calling all_plugins_play to load vars for managed_node3 28023 1726853646.44227: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853646.44230: Calling groups_plugins_play to load vars for managed_node3 28023 1726853646.45809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853646.47500: done with get_vars() 28023 1726853646.47532: done getting variables 28023 1726853646.47593: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853646.47708: variable 'profile' from source: include params 28023 1726853646.47712: variable 'item' from source: include params 28023 1726853646.47780: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:34:06 -0400 (0:00:00.056) 0:00:38.562 ****** 28023 1726853646.47812: entering _queue_task() for managed_node3/command 28023 1726853646.48292: worker is 1 (out of 1 available) 28023 1726853646.48304: exiting _queue_task() for managed_node3/command 28023 1726853646.48314: done queuing things up, now waiting for results queue to drain 28023 1726853646.48315: waiting for pending results... 
28023 1726853646.48530: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 28023 1726853646.48669: in run() - task 02083763-bbaf-fdb6-dad7-000000000b44 28023 1726853646.48684: variable 'ansible_search_path' from source: unknown 28023 1726853646.48688: variable 'ansible_search_path' from source: unknown 28023 1726853646.48733: calling self._execute() 28023 1726853646.48854: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.48861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.48879: variable 'omit' from source: magic vars 28023 1726853646.49319: variable 'ansible_distribution_major_version' from source: facts 28023 1726853646.49323: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853646.49437: variable 'profile_stat' from source: set_fact 28023 1726853646.49446: Evaluated conditional (profile_stat.stat.exists): False 28023 1726853646.49449: when evaluation is False, skipping this task 28023 1726853646.49452: _execute() done 28023 1726853646.49455: dumping result to json 28023 1726853646.49458: done dumping result, returning 28023 1726853646.49467: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 [02083763-bbaf-fdb6-dad7-000000000b44] 28023 1726853646.49473: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b44 28023 1726853646.49563: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b44 28023 1726853646.49566: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28023 1726853646.49634: no more pending results, returning what we have 28023 1726853646.49638: results queue empty 28023 1726853646.49639: checking for any_errors_fatal 28023 1726853646.49647: done checking for any_errors_fatal 28023 1726853646.49647: checking for 
max_fail_percentage 28023 1726853646.49649: done checking for max_fail_percentage 28023 1726853646.49650: checking to see if all hosts have failed and the running result is not ok 28023 1726853646.49651: done checking to see if all hosts have failed 28023 1726853646.49652: getting the remaining hosts for this loop 28023 1726853646.49653: done getting the remaining hosts for this loop 28023 1726853646.49656: getting the next task for host managed_node3 28023 1726853646.49665: done getting next task for host managed_node3 28023 1726853646.49668: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 28023 1726853646.49674: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853646.49679: getting variables 28023 1726853646.49680: in VariableManager get_vars() 28023 1726853646.49720: Calling all_inventory to load vars for managed_node3 28023 1726853646.49722: Calling groups_inventory to load vars for managed_node3 28023 1726853646.49724: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853646.49735: Calling all_plugins_play to load vars for managed_node3 28023 1726853646.49738: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853646.49740: Calling groups_plugins_play to load vars for managed_node3 28023 1726853646.50656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853646.51874: done with get_vars() 28023 1726853646.51899: done getting variables 28023 1726853646.51956: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853646.52054: variable 'profile' from source: include params 28023 1726853646.52057: variable 'item' from source: include params 28023 1726853646.52112: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:34:06 -0400 (0:00:00.043) 0:00:38.605 ****** 28023 1726853646.52147: entering _queue_task() for managed_node3/set_fact 28023 1726853646.52486: worker is 1 (out of 1 available) 28023 1726853646.52499: exiting _queue_task() for managed_node3/set_fact 28023 1726853646.52511: done queuing things up, now waiting for results queue to drain 28023 1726853646.52513: waiting for pending results... 
28023 1726853646.52892: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 28023 1726853646.52943: in run() - task 02083763-bbaf-fdb6-dad7-000000000b45 28023 1726853646.52964: variable 'ansible_search_path' from source: unknown 28023 1726853646.52974: variable 'ansible_search_path' from source: unknown 28023 1726853646.53023: calling self._execute() 28023 1726853646.53139: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.53152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.53168: variable 'omit' from source: magic vars 28023 1726853646.53522: variable 'ansible_distribution_major_version' from source: facts 28023 1726853646.53538: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853646.53621: variable 'profile_stat' from source: set_fact 28023 1726853646.53630: Evaluated conditional (profile_stat.stat.exists): False 28023 1726853646.53635: when evaluation is False, skipping this task 28023 1726853646.53637: _execute() done 28023 1726853646.53640: dumping result to json 28023 1726853646.53642: done dumping result, returning 28023 1726853646.53649: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [02083763-bbaf-fdb6-dad7-000000000b45] 28023 1726853646.53654: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b45 28023 1726853646.53738: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b45 28023 1726853646.53740: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28023 1726853646.53800: no more pending results, returning what we have 28023 1726853646.53803: results queue empty 28023 1726853646.53804: checking for any_errors_fatal 28023 1726853646.53814: done checking for any_errors_fatal 28023 1726853646.53814: 
checking for max_fail_percentage 28023 1726853646.53816: done checking for max_fail_percentage 28023 1726853646.53817: checking to see if all hosts have failed and the running result is not ok 28023 1726853646.53817: done checking to see if all hosts have failed 28023 1726853646.53818: getting the remaining hosts for this loop 28023 1726853646.53820: done getting the remaining hosts for this loop 28023 1726853646.53823: getting the next task for host managed_node3 28023 1726853646.53833: done getting next task for host managed_node3 28023 1726853646.53836: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 28023 1726853646.53841: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853646.53846: getting variables 28023 1726853646.53847: in VariableManager get_vars() 28023 1726853646.53893: Calling all_inventory to load vars for managed_node3 28023 1726853646.53895: Calling groups_inventory to load vars for managed_node3 28023 1726853646.53897: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853646.53908: Calling all_plugins_play to load vars for managed_node3 28023 1726853646.53911: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853646.53913: Calling groups_plugins_play to load vars for managed_node3 28023 1726853646.54714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853646.55947: done with get_vars() 28023 1726853646.55974: done getting variables 28023 1726853646.56033: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853646.56150: variable 'profile' from source: include params 28023 1726853646.56153: variable 'item' from source: include params 28023 1726853646.56197: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 13:34:06 -0400 (0:00:00.040) 0:00:38.646 ****** 28023 1726853646.56219: entering _queue_task() for managed_node3/assert 28023 1726853646.56461: worker is 1 (out of 1 available) 28023 1726853646.56477: exiting _queue_task() for managed_node3/assert 28023 1726853646.56490: done queuing things up, now waiting for results queue to drain 28023 1726853646.56492: waiting for pending results... 
28023 1726853646.56664: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'ethtest0' 28023 1726853646.56742: in run() - task 02083763-bbaf-fdb6-dad7-000000000a6d 28023 1726853646.56752: variable 'ansible_search_path' from source: unknown 28023 1726853646.56756: variable 'ansible_search_path' from source: unknown 28023 1726853646.56787: calling self._execute() 28023 1726853646.56870: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.56877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.56885: variable 'omit' from source: magic vars 28023 1726853646.57149: variable 'ansible_distribution_major_version' from source: facts 28023 1726853646.57163: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853646.57166: variable 'omit' from source: magic vars 28023 1726853646.57196: variable 'omit' from source: magic vars 28023 1726853646.57263: variable 'profile' from source: include params 28023 1726853646.57267: variable 'item' from source: include params 28023 1726853646.57312: variable 'item' from source: include params 28023 1726853646.57328: variable 'omit' from source: magic vars 28023 1726853646.57363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853646.57392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853646.57408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853646.57422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853646.57431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853646.57454: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 28023 1726853646.57460: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.57463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.57532: Set connection var ansible_shell_type to sh 28023 1726853646.57539: Set connection var ansible_shell_executable to /bin/sh 28023 1726853646.57544: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853646.57550: Set connection var ansible_connection to ssh 28023 1726853646.57554: Set connection var ansible_pipelining to False 28023 1726853646.57561: Set connection var ansible_timeout to 10 28023 1726853646.57581: variable 'ansible_shell_executable' from source: unknown 28023 1726853646.57584: variable 'ansible_connection' from source: unknown 28023 1726853646.57586: variable 'ansible_module_compression' from source: unknown 28023 1726853646.57588: variable 'ansible_shell_type' from source: unknown 28023 1726853646.57591: variable 'ansible_shell_executable' from source: unknown 28023 1726853646.57593: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.57600: variable 'ansible_pipelining' from source: unknown 28023 1726853646.57603: variable 'ansible_timeout' from source: unknown 28023 1726853646.57605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.57707: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853646.57716: variable 'omit' from source: magic vars 28023 1726853646.57724: starting attempt loop 28023 1726853646.57726: running the handler 28023 1726853646.57807: variable 'lsr_net_profile_exists' from source: set_fact 28023 1726853646.57811: Evaluated conditional (not 
lsr_net_profile_exists): True 28023 1726853646.57818: handler run complete 28023 1726853646.57832: attempt loop complete, returning result 28023 1726853646.57836: _execute() done 28023 1726853646.57838: dumping result to json 28023 1726853646.57841: done dumping result, returning 28023 1726853646.57845: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'ethtest0' [02083763-bbaf-fdb6-dad7-000000000a6d] 28023 1726853646.57850: sending task result for task 02083763-bbaf-fdb6-dad7-000000000a6d 28023 1726853646.57931: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000a6d 28023 1726853646.57935: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 28023 1726853646.57988: no more pending results, returning what we have 28023 1726853646.57991: results queue empty 28023 1726853646.57992: checking for any_errors_fatal 28023 1726853646.57999: done checking for any_errors_fatal 28023 1726853646.58000: checking for max_fail_percentage 28023 1726853646.58001: done checking for max_fail_percentage 28023 1726853646.58002: checking to see if all hosts have failed and the running result is not ok 28023 1726853646.58003: done checking to see if all hosts have failed 28023 1726853646.58004: getting the remaining hosts for this loop 28023 1726853646.58005: done getting the remaining hosts for this loop 28023 1726853646.58008: getting the next task for host managed_node3 28023 1726853646.58019: done getting next task for host managed_node3 28023 1726853646.58022: ^ task is: TASK: Include the task 'get_profile_stat.yml' 28023 1726853646.58026: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853646.58031: getting variables 28023 1726853646.58032: in VariableManager get_vars() 28023 1726853646.58082: Calling all_inventory to load vars for managed_node3 28023 1726853646.58085: Calling groups_inventory to load vars for managed_node3 28023 1726853646.58087: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853646.58097: Calling all_plugins_play to load vars for managed_node3 28023 1726853646.58099: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853646.58102: Calling groups_plugins_play to load vars for managed_node3 28023 1726853646.59012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853646.59885: done with get_vars() 28023 1726853646.59905: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:34:06 -0400 (0:00:00.037) 0:00:38.683 ****** 28023 1726853646.59974: entering _queue_task() for managed_node3/include_tasks 28023 1726853646.60237: worker is 1 (out of 1 available) 28023 1726853646.60251: exiting _queue_task() for managed_node3/include_tasks 28023 1726853646.60266: done queuing things up, now waiting for 
results queue to drain 28023 1726853646.60268: waiting for pending results... 28023 1726853646.60451: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 28023 1726853646.60535: in run() - task 02083763-bbaf-fdb6-dad7-000000000a71 28023 1726853646.60545: variable 'ansible_search_path' from source: unknown 28023 1726853646.60548: variable 'ansible_search_path' from source: unknown 28023 1726853646.60580: calling self._execute() 28023 1726853646.60666: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.60670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.60680: variable 'omit' from source: magic vars 28023 1726853646.60949: variable 'ansible_distribution_major_version' from source: facts 28023 1726853646.60962: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853646.60966: _execute() done 28023 1726853646.60969: dumping result to json 28023 1726853646.60973: done dumping result, returning 28023 1726853646.60979: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-fdb6-dad7-000000000a71] 28023 1726853646.60984: sending task result for task 02083763-bbaf-fdb6-dad7-000000000a71 28023 1726853646.61072: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000a71 28023 1726853646.61075: WORKER PROCESS EXITING 28023 1726853646.61102: no more pending results, returning what we have 28023 1726853646.61106: in VariableManager get_vars() 28023 1726853646.61155: Calling all_inventory to load vars for managed_node3 28023 1726853646.61160: Calling groups_inventory to load vars for managed_node3 28023 1726853646.61162: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853646.61181: Calling all_plugins_play to load vars for managed_node3 28023 1726853646.61186: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853646.61189: 
Calling groups_plugins_play to load vars for managed_node3 28023 1726853646.61987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853646.62939: done with get_vars() 28023 1726853646.62953: variable 'ansible_search_path' from source: unknown 28023 1726853646.62954: variable 'ansible_search_path' from source: unknown 28023 1726853646.62984: we have included files to process 28023 1726853646.62985: generating all_blocks data 28023 1726853646.62986: done generating all_blocks data 28023 1726853646.62989: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28023 1726853646.62990: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28023 1726853646.62991: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28023 1726853646.63588: done processing included file 28023 1726853646.63589: iterating over new_blocks loaded from include file 28023 1726853646.63590: in VariableManager get_vars() 28023 1726853646.63604: done with get_vars() 28023 1726853646.63606: filtering new block on tags 28023 1726853646.63643: done filtering new block on tags 28023 1726853646.63645: in VariableManager get_vars() 28023 1726853646.63659: done with get_vars() 28023 1726853646.63661: filtering new block on tags 28023 1726853646.63695: done filtering new block on tags 28023 1726853646.63697: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 28023 1726853646.63701: extending task lists for all hosts with included blocks 28023 1726853646.63774: done extending task lists 28023 1726853646.63775: done processing included 
files 28023 1726853646.63776: results queue empty 28023 1726853646.63776: checking for any_errors_fatal 28023 1726853646.63778: done checking for any_errors_fatal 28023 1726853646.63779: checking for max_fail_percentage 28023 1726853646.63779: done checking for max_fail_percentage 28023 1726853646.63780: checking to see if all hosts have failed and the running result is not ok 28023 1726853646.63781: done checking to see if all hosts have failed 28023 1726853646.63781: getting the remaining hosts for this loop 28023 1726853646.63782: done getting the remaining hosts for this loop 28023 1726853646.63783: getting the next task for host managed_node3 28023 1726853646.63786: done getting next task for host managed_node3 28023 1726853646.63788: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 28023 1726853646.63790: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853646.63792: getting variables 28023 1726853646.63792: in VariableManager get_vars() 28023 1726853646.63801: Calling all_inventory to load vars for managed_node3 28023 1726853646.63802: Calling groups_inventory to load vars for managed_node3 28023 1726853646.63803: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853646.63807: Calling all_plugins_play to load vars for managed_node3 28023 1726853646.63808: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853646.63810: Calling groups_plugins_play to load vars for managed_node3 28023 1726853646.64459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853646.65307: done with get_vars() 28023 1726853646.65321: done getting variables 28023 1726853646.65346: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:34:06 -0400 (0:00:00.053) 0:00:38.737 ****** 28023 1726853646.65370: entering _queue_task() for managed_node3/set_fact 28023 1726853646.65632: worker is 1 (out of 1 available) 28023 1726853646.65645: exiting _queue_task() for managed_node3/set_fact 28023 1726853646.65659: done queuing things up, now waiting for results queue to drain 28023 1726853646.65661: waiting for pending results... 
28023 1726853646.65841: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 28023 1726853646.65930: in run() - task 02083763-bbaf-fdb6-dad7-000000000b79 28023 1726853646.65941: variable 'ansible_search_path' from source: unknown 28023 1726853646.65944: variable 'ansible_search_path' from source: unknown 28023 1726853646.65976: calling self._execute() 28023 1726853646.66054: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.66061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.66069: variable 'omit' from source: magic vars 28023 1726853646.66343: variable 'ansible_distribution_major_version' from source: facts 28023 1726853646.66352: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853646.66360: variable 'omit' from source: magic vars 28023 1726853646.66397: variable 'omit' from source: magic vars 28023 1726853646.66420: variable 'omit' from source: magic vars 28023 1726853646.66454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853646.66484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853646.66500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853646.66513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853646.66522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853646.66548: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853646.66553: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.66555: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 28023 1726853646.66623: Set connection var ansible_shell_type to sh 28023 1726853646.66629: Set connection var ansible_shell_executable to /bin/sh 28023 1726853646.66636: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853646.66641: Set connection var ansible_connection to ssh 28023 1726853646.66648: Set connection var ansible_pipelining to False 28023 1726853646.66652: Set connection var ansible_timeout to 10 28023 1726853646.66676: variable 'ansible_shell_executable' from source: unknown 28023 1726853646.66680: variable 'ansible_connection' from source: unknown 28023 1726853646.66683: variable 'ansible_module_compression' from source: unknown 28023 1726853646.66685: variable 'ansible_shell_type' from source: unknown 28023 1726853646.66688: variable 'ansible_shell_executable' from source: unknown 28023 1726853646.66690: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.66692: variable 'ansible_pipelining' from source: unknown 28023 1726853646.66694: variable 'ansible_timeout' from source: unknown 28023 1726853646.66699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.66799: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853646.66809: variable 'omit' from source: magic vars 28023 1726853646.66814: starting attempt loop 28023 1726853646.66817: running the handler 28023 1726853646.66827: handler run complete 28023 1726853646.66835: attempt loop complete, returning result 28023 1726853646.66838: _execute() done 28023 1726853646.66841: dumping result to json 28023 1726853646.66843: done dumping result, returning 28023 1726853646.66849: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-fdb6-dad7-000000000b79] 28023 1726853646.66853: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b79 28023 1726853646.66930: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b79 28023 1726853646.66932: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 28023 1726853646.66988: no more pending results, returning what we have 28023 1726853646.66991: results queue empty 28023 1726853646.66992: checking for any_errors_fatal 28023 1726853646.66993: done checking for any_errors_fatal 28023 1726853646.66994: checking for max_fail_percentage 28023 1726853646.66995: done checking for max_fail_percentage 28023 1726853646.66996: checking to see if all hosts have failed and the running result is not ok 28023 1726853646.66997: done checking to see if all hosts have failed 28023 1726853646.66998: getting the remaining hosts for this loop 28023 1726853646.66999: done getting the remaining hosts for this loop 28023 1726853646.67002: getting the next task for host managed_node3 28023 1726853646.67011: done getting next task for host managed_node3 28023 1726853646.67014: ^ task is: TASK: Stat profile file 28023 1726853646.67019: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853646.67022: getting variables 28023 1726853646.67023: in VariableManager get_vars() 28023 1726853646.67066: Calling all_inventory to load vars for managed_node3 28023 1726853646.67069: Calling groups_inventory to load vars for managed_node3 28023 1726853646.67073: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853646.67084: Calling all_plugins_play to load vars for managed_node3 28023 1726853646.67087: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853646.67089: Calling groups_plugins_play to load vars for managed_node3 28023 1726853646.67937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853646.68798: done with get_vars() 28023 1726853646.68816: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:34:06 -0400 (0:00:00.035) 0:00:38.772 ****** 28023 1726853646.68886: entering _queue_task() for managed_node3/stat 28023 1726853646.69146: worker is 1 (out of 1 available) 28023 1726853646.69162: exiting _queue_task() for 
managed_node3/stat 28023 1726853646.69176: done queuing things up, now waiting for results queue to drain 28023 1726853646.69178: waiting for pending results... 28023 1726853646.69361: running TaskExecutor() for managed_node3/TASK: Stat profile file 28023 1726853646.69452: in run() - task 02083763-bbaf-fdb6-dad7-000000000b7a 28023 1726853646.69463: variable 'ansible_search_path' from source: unknown 28023 1726853646.69467: variable 'ansible_search_path' from source: unknown 28023 1726853646.69496: calling self._execute() 28023 1726853646.69580: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.69584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.69592: variable 'omit' from source: magic vars 28023 1726853646.69873: variable 'ansible_distribution_major_version' from source: facts 28023 1726853646.69886: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853646.69892: variable 'omit' from source: magic vars 28023 1726853646.69925: variable 'omit' from source: magic vars 28023 1726853646.69995: variable 'profile' from source: include params 28023 1726853646.69999: variable 'item' from source: include params 28023 1726853646.70042: variable 'item' from source: include params 28023 1726853646.70064: variable 'omit' from source: magic vars 28023 1726853646.70096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853646.70124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853646.70139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853646.70152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853646.70167: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853646.70191: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853646.70194: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.70197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.70263: Set connection var ansible_shell_type to sh 28023 1726853646.70267: Set connection var ansible_shell_executable to /bin/sh 28023 1726853646.70274: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853646.70285: Set connection var ansible_connection to ssh 28023 1726853646.70288: Set connection var ansible_pipelining to False 28023 1726853646.70291: Set connection var ansible_timeout to 10 28023 1726853646.70309: variable 'ansible_shell_executable' from source: unknown 28023 1726853646.70312: variable 'ansible_connection' from source: unknown 28023 1726853646.70315: variable 'ansible_module_compression' from source: unknown 28023 1726853646.70317: variable 'ansible_shell_type' from source: unknown 28023 1726853646.70320: variable 'ansible_shell_executable' from source: unknown 28023 1726853646.70322: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853646.70324: variable 'ansible_pipelining' from source: unknown 28023 1726853646.70327: variable 'ansible_timeout' from source: unknown 28023 1726853646.70332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853646.70475: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28023 1726853646.70484: variable 'omit' from source: magic vars 28023 1726853646.70489: starting attempt loop 28023 1726853646.70494: running 
the handler 28023 1726853646.70507: _low_level_execute_command(): starting 28023 1726853646.70513: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853646.71026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853646.71030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.71033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853646.71037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.71093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853646.71096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853646.71098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853646.71173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853646.72900: stdout chunk (state=3): >>>/root <<< 28023 1726853646.73003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853646.73033: stderr chunk (state=3): >>><<< 28023 1726853646.73037: stdout chunk 
(state=3): >>><<< 28023 1726853646.73058: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853646.73074: _low_level_execute_command(): starting 28023 1726853646.73081: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967 `" && echo ansible-tmp-1726853646.7306097-29750-92817081340967="` echo /root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967 `" ) && sleep 0' 28023 1726853646.73530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853646.73534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853646.73543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.73546: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853646.73548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.73589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853646.73596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853646.73598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853646.73660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853646.75629: stdout chunk (state=3): >>>ansible-tmp-1726853646.7306097-29750-92817081340967=/root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967 <<< 28023 1726853646.75737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853646.75763: stderr chunk (state=3): >>><<< 28023 1726853646.75766: stdout chunk (state=3): >>><<< 28023 1726853646.75784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853646.7306097-29750-92817081340967=/root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853646.75822: variable 'ansible_module_compression' from source: unknown 28023 1726853646.75866: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28023 1726853646.75898: variable 'ansible_facts' from source: unknown 28023 1726853646.75960: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967/AnsiballZ_stat.py 28023 1726853646.76059: Sending initial data 28023 1726853646.76062: Sent initial data (152 bytes) 28023 1726853646.76511: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853646.76514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 
1726853646.76516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.76518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853646.76520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853646.76522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.76574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853646.76581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853646.76642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853646.78279: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server 
extension "home-directory" <<< 28023 1726853646.78283: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853646.78335: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853646.78396: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpmydtikdv /root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967/AnsiballZ_stat.py <<< 28023 1726853646.78399: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967/AnsiballZ_stat.py" <<< 28023 1726853646.78453: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpmydtikdv" to remote "/root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967/AnsiballZ_stat.py" <<< 28023 1726853646.78459: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967/AnsiballZ_stat.py" <<< 28023 1726853646.79048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853646.79091: stderr chunk (state=3): >>><<< 28023 1726853646.79094: stdout chunk (state=3): >>><<< 28023 1726853646.79126: done transferring module to remote 28023 1726853646.79135: _low_level_execute_command(): starting 28023 1726853646.79139: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967/ /root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967/AnsiballZ_stat.py && sleep 0' 28023 1726853646.79548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853646.79591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853646.79594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853646.79597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853646.79603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853646.79605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.79645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853646.79648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853646.79652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853646.79707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853646.81559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853646.81586: stderr chunk (state=3): >>><<< 28023 1726853646.81589: stdout chunk (state=3): >>><<< 28023 1726853646.81603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853646.81606: _low_level_execute_command(): starting 28023 1726853646.81610: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967/AnsiballZ_stat.py && sleep 0' 28023 1726853646.82032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853646.82070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853646.82076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853646.82078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.82080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 
1726853646.82082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.82127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853646.82130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853646.82134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853646.82201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853646.97851: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28023 1726853646.99183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853646.99212: stderr chunk (state=3): >>><<< 28023 1726853646.99215: stdout chunk (state=3): >>><<< 28023 1726853646.99229: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
28023 1726853646.99258: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853646.99264: _low_level_execute_command(): starting 28023 1726853646.99274: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853646.7306097-29750-92817081340967/ > /dev/null 2>&1 && sleep 0' 28023 1726853646.99730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853646.99733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.99736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853646.99742: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853646.99745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853646.99777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853646.99790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853646.99855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853647.01737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853647.01765: stderr chunk (state=3): >>><<< 28023 1726853647.01768: stdout chunk (state=3): >>><<< 28023 1726853647.01785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853647.01790: handler run complete 28023 1726853647.01806: attempt loop complete, returning result 28023 1726853647.01809: _execute() done 28023 1726853647.01811: dumping result to json 28023 1726853647.01815: done dumping result, returning 28023 1726853647.01823: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-fdb6-dad7-000000000b7a] 28023 1726853647.01827: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b7a 28023 1726853647.01923: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b7a 28023 1726853647.01926: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 28023 1726853647.01983: no more pending results, returning what we have 28023 1726853647.01986: results queue empty 28023 1726853647.01987: checking for any_errors_fatal 28023 1726853647.01995: done checking for any_errors_fatal 28023 1726853647.01995: checking for max_fail_percentage 28023 1726853647.01997: done checking for max_fail_percentage 28023 1726853647.01998: checking to see if all hosts have failed and the running result is not ok 28023 1726853647.01999: done checking to see if all hosts have failed 28023 1726853647.02000: getting the remaining hosts for this loop 28023 1726853647.02001: done getting the remaining hosts for this loop 28023 1726853647.02004: getting the next task for host managed_node3 28023 1726853647.02012: done getting next task for host managed_node3 28023 1726853647.02014: ^ task is: TASK: Set NM profile exist flag based on the profile files 28023 1726853647.02019: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853647.02024: getting variables 28023 1726853647.02025: in VariableManager get_vars() 28023 1726853647.02069: Calling all_inventory to load vars for managed_node3 28023 1726853647.02079: Calling groups_inventory to load vars for managed_node3 28023 1726853647.02082: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.02093: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.02096: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.02099: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.02899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.03786: done with get_vars() 28023 1726853647.03802: done getting variables 28023 1726853647.03846: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:34:07 -0400 (0:00:00.349) 0:00:39.122 ****** 28023 1726853647.03873: entering _queue_task() for managed_node3/set_fact 28023 1726853647.04114: worker is 1 (out of 1 available) 28023 1726853647.04126: exiting _queue_task() for managed_node3/set_fact 28023 1726853647.04138: done queuing things up, now waiting for results queue to drain 28023 1726853647.04139: waiting for pending results... 28023 1726853647.04322: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 28023 1726853647.04398: in run() - task 02083763-bbaf-fdb6-dad7-000000000b7b 28023 1726853647.04410: variable 'ansible_search_path' from source: unknown 28023 1726853647.04414: variable 'ansible_search_path' from source: unknown 28023 1726853647.04441: calling self._execute() 28023 1726853647.04524: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.04530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.04538: variable 'omit' from source: magic vars 28023 1726853647.04813: variable 'ansible_distribution_major_version' from source: facts 28023 1726853647.04823: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853647.04905: variable 'profile_stat' from source: set_fact 28023 1726853647.04917: Evaluated conditional (profile_stat.stat.exists): False 28023 1726853647.04921: when evaluation is False, skipping this task 28023 1726853647.04923: _execute() done 28023 1726853647.04926: dumping result to json 28023 1726853647.04928: done dumping 
result, returning 28023 1726853647.04933: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-fdb6-dad7-000000000b7b] 28023 1726853647.04938: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b7b 28023 1726853647.05022: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b7b 28023 1726853647.05024: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28023 1726853647.05083: no more pending results, returning what we have 28023 1726853647.05087: results queue empty 28023 1726853647.05088: checking for any_errors_fatal 28023 1726853647.05100: done checking for any_errors_fatal 28023 1726853647.05101: checking for max_fail_percentage 28023 1726853647.05102: done checking for max_fail_percentage 28023 1726853647.05103: checking to see if all hosts have failed and the running result is not ok 28023 1726853647.05104: done checking to see if all hosts have failed 28023 1726853647.05105: getting the remaining hosts for this loop 28023 1726853647.05107: done getting the remaining hosts for this loop 28023 1726853647.05110: getting the next task for host managed_node3 28023 1726853647.05118: done getting next task for host managed_node3 28023 1726853647.05121: ^ task is: TASK: Get NM profile info 28023 1726853647.05125: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853647.05128: getting variables 28023 1726853647.05130: in VariableManager get_vars() 28023 1726853647.05168: Calling all_inventory to load vars for managed_node3 28023 1726853647.05172: Calling groups_inventory to load vars for managed_node3 28023 1726853647.05174: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.05185: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.05188: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.05190: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.06455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.07391: done with get_vars() 28023 1726853647.07409: done getting variables 28023 1726853647.07456: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:34:07 -0400 (0:00:00.036) 0:00:39.158 ****** 28023 1726853647.07484: entering _queue_task() for managed_node3/shell 28023 1726853647.07741: worker is 1 (out of 1 available) 28023 1726853647.07754: exiting _queue_task() for managed_node3/shell 28023 1726853647.07766: done queuing things up, now waiting for results queue to drain 28023 1726853647.07767: waiting for pending results... 28023 1726853647.07955: running TaskExecutor() for managed_node3/TASK: Get NM profile info 28023 1726853647.08031: in run() - task 02083763-bbaf-fdb6-dad7-000000000b7c 28023 1726853647.08043: variable 'ansible_search_path' from source: unknown 28023 1726853647.08046: variable 'ansible_search_path' from source: unknown 28023 1726853647.08080: calling self._execute() 28023 1726853647.08162: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.08170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.08179: variable 'omit' from source: magic vars 28023 1726853647.08501: variable 'ansible_distribution_major_version' from source: facts 28023 1726853647.08520: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853647.08523: variable 'omit' from source: magic vars 28023 1726853647.08676: variable 'omit' from source: magic vars 28023 1726853647.08679: variable 'profile' from source: include params 28023 1726853647.08682: variable 'item' from source: include params 28023 1726853647.08729: variable 'item' from source: include params 28023 1726853647.08750: variable 'omit' from source: magic vars 28023 1726853647.08907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853647.08911: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 28023 1726853647.08914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853647.08916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853647.08918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853647.08920: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853647.08922: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.08925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.09283: Set connection var ansible_shell_type to sh 28023 1726853647.09286: Set connection var ansible_shell_executable to /bin/sh 28023 1726853647.09289: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853647.09291: Set connection var ansible_connection to ssh 28023 1726853647.09293: Set connection var ansible_pipelining to False 28023 1726853647.09296: Set connection var ansible_timeout to 10 28023 1726853647.09298: variable 'ansible_shell_executable' from source: unknown 28023 1726853647.09300: variable 'ansible_connection' from source: unknown 28023 1726853647.09301: variable 'ansible_module_compression' from source: unknown 28023 1726853647.09303: variable 'ansible_shell_type' from source: unknown 28023 1726853647.09305: variable 'ansible_shell_executable' from source: unknown 28023 1726853647.09307: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.09309: variable 'ansible_pipelining' from source: unknown 28023 1726853647.09311: variable 'ansible_timeout' from source: unknown 28023 1726853647.09313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.09315: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853647.09318: variable 'omit' from source: magic vars 28023 1726853647.09319: starting attempt loop 28023 1726853647.09321: running the handler 28023 1726853647.09323: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853647.09326: _low_level_execute_command(): starting 28023 1726853647.09328: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853647.09935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853647.09947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853647.09958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853647.09977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853647.09990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853647.09996: stderr chunk (state=3): >>>debug2: match not found <<< 28023 1726853647.10006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853647.10095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853647.10113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853647.10206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853647.11920: stdout chunk (state=3): >>>/root <<< 28023 1726853647.12073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853647.12077: stdout chunk (state=3): >>><<< 28023 1726853647.12079: stderr chunk (state=3): >>><<< 28023 1726853647.12098: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853647.12208: _low_level_execute_command(): starting 28023 1726853647.12212: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650 `" && echo ansible-tmp-1726853647.1210785-29760-197134573090650="` echo /root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650 `" ) && sleep 0' 28023 1726853647.12807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853647.12828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853647.12856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853647.12900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853647.12988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853647.13036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853647.13090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 
1726853647.13109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853647.13200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853647.15219: stdout chunk (state=3): >>>ansible-tmp-1726853647.1210785-29760-197134573090650=/root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650 <<< 28023 1726853647.15381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853647.15384: stdout chunk (state=3): >>><<< 28023 1726853647.15386: stderr chunk (state=3): >>><<< 28023 1726853647.15403: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853647.1210785-29760-197134573090650=/root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 
1726853647.15440: variable 'ansible_module_compression' from source: unknown 28023 1726853647.15579: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853647.15583: variable 'ansible_facts' from source: unknown 28023 1726853647.15630: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650/AnsiballZ_command.py 28023 1726853647.15811: Sending initial data 28023 1726853647.15820: Sent initial data (156 bytes) 28023 1726853647.16415: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853647.16429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853647.16449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853647.16468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853647.16567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853647.16586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853647.16607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 28023 1726853647.16707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853647.18424: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853647.18485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28023 1726853647.18547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpf_dzbsm7 /root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650/AnsiballZ_command.py <<< 28023 1726853647.18551: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650/AnsiballZ_command.py" <<< 28023 1726853647.18601: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpf_dzbsm7" to remote "/root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650/AnsiballZ_command.py" <<< 28023 1726853647.20368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853647.20376: stdout chunk (state=3): >>><<< 28023 1726853647.20478: stderr chunk (state=3): >>><<< 28023 1726853647.20485: done transferring module to remote 28023 1726853647.20489: _low_level_execute_command(): starting 28023 1726853647.20493: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650/ /root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650/AnsiballZ_command.py && sleep 0' 28023 1726853647.21451: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853647.21467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853647.21491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853647.21517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853647.21676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 
28023 1726853647.21691: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853647.21722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853647.21827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853647.23736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853647.23797: stderr chunk (state=3): >>><<< 28023 1726853647.23818: stdout chunk (state=3): >>><<< 28023 1726853647.23839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853647.23847: _low_level_execute_command(): starting 28023 1726853647.23856: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650/AnsiballZ_command.py && sleep 0' 28023 1726853647.24474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853647.24489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853647.24504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853647.24527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853647.24546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853647.24583: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 
1726853647.24658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853647.24678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853647.24699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853647.24798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853647.42163: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-20 13:34:07.402970", "end": "2024-09-20 13:34:07.420357", "delta": "0:00:00.017387", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853647.43880: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853647.43884: stdout chunk (state=3): >>><<< 28023 1726853647.43887: stderr chunk (state=3): >>><<< 28023 1726853647.43889: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-20 13:34:07.402970", "end": "2024-09-20 13:34:07.420357", "delta": "0:00:00.017387", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 
10.31.11.217 closed. 28023 1726853647.43891: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853647.43977: _low_level_execute_command(): starting 28023 1726853647.43981: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853647.1210785-29760-197134573090650/ > /dev/null 2>&1 && sleep 0' 28023 1726853647.44586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853647.44602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853647.44617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853647.44638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853647.44682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853647.44695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853647.44773: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853647.44801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853647.44817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853647.44914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853647.46810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853647.46876: stderr chunk (state=3): >>><<< 28023 1726853647.46880: stdout chunk (state=3): >>><<< 28023 1726853647.47077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853647.47080: handler run complete 28023 1726853647.47083: Evaluated conditional (False): False 28023 1726853647.47086: attempt loop complete, returning result 28023 1726853647.47088: _execute() done 28023 1726853647.47090: dumping result to json 28023 1726853647.47093: done dumping result, returning 28023 1726853647.47095: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-fdb6-dad7-000000000b7c] 28023 1726853647.47097: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b7c 28023 1726853647.47170: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b7c 28023 1726853647.47180: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "delta": "0:00:00.017387", "end": "2024-09-20 13:34:07.420357", "rc": 1, "start": "2024-09-20 13:34:07.402970" } MSG: non-zero return code ...ignoring 28023 1726853647.47261: no more pending results, returning what we have 28023 1726853647.47264: results queue empty 28023 1726853647.47265: checking for any_errors_fatal 28023 1726853647.47275: done checking for any_errors_fatal 28023 1726853647.47276: checking for max_fail_percentage 28023 1726853647.47278: done checking for max_fail_percentage 28023 1726853647.47279: checking to see if all hosts have failed and the running result is not ok 28023 1726853647.47284: done checking to see if all hosts have failed 28023 1726853647.47285: getting the remaining hosts for this loop 28023 1726853647.47287: done getting the remaining hosts for this loop 28023 1726853647.47295: getting the next task for host managed_node3 28023 1726853647.47305: done getting next task for host managed_node3 28023 1726853647.47308: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28023 1726853647.47314: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853647.47319: getting variables 28023 1726853647.47320: in VariableManager get_vars() 28023 1726853647.47365: Calling all_inventory to load vars for managed_node3 28023 1726853647.47369: Calling groups_inventory to load vars for managed_node3 28023 1726853647.47515: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.47528: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.47531: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.47534: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.49115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.54949: done with get_vars() 28023 1726853647.54966: done getting variables 28023 1726853647.55004: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:34:07 -0400 (0:00:00.475) 0:00:39.634 ****** 28023 1726853647.55024: entering _queue_task() for managed_node3/set_fact 28023 1726853647.55286: worker is 1 (out of 1 available) 28023 1726853647.55298: exiting _queue_task() for 
managed_node3/set_fact 28023 1726853647.55310: done queuing things up, now waiting for results queue to drain 28023 1726853647.55312: waiting for pending results... 28023 1726853647.55500: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28023 1726853647.55588: in run() - task 02083763-bbaf-fdb6-dad7-000000000b7d 28023 1726853647.55600: variable 'ansible_search_path' from source: unknown 28023 1726853647.55604: variable 'ansible_search_path' from source: unknown 28023 1726853647.55632: calling self._execute() 28023 1726853647.55719: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.55723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.55731: variable 'omit' from source: magic vars 28023 1726853647.56016: variable 'ansible_distribution_major_version' from source: facts 28023 1726853647.56026: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853647.56183: variable 'nm_profile_exists' from source: set_fact 28023 1726853647.56187: Evaluated conditional (nm_profile_exists.rc == 0): False 28023 1726853647.56193: when evaluation is False, skipping this task 28023 1726853647.56195: _execute() done 28023 1726853647.56197: dumping result to json 28023 1726853647.56200: done dumping result, returning 28023 1726853647.56203: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-fdb6-dad7-000000000b7d] 28023 1726853647.56206: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b7d 28023 1726853647.56316: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b7d 28023 1726853647.56319: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 28023 
1726853647.56404: no more pending results, returning what we have 28023 1726853647.56407: results queue empty 28023 1726853647.56408: checking for any_errors_fatal 28023 1726853647.56414: done checking for any_errors_fatal 28023 1726853647.56414: checking for max_fail_percentage 28023 1726853647.56416: done checking for max_fail_percentage 28023 1726853647.56417: checking to see if all hosts have failed and the running result is not ok 28023 1726853647.56418: done checking to see if all hosts have failed 28023 1726853647.56418: getting the remaining hosts for this loop 28023 1726853647.56420: done getting the remaining hosts for this loop 28023 1726853647.56422: getting the next task for host managed_node3 28023 1726853647.56431: done getting next task for host managed_node3 28023 1726853647.56433: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 28023 1726853647.56438: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 28023 1726853647.56442: getting variables 28023 1726853647.56443: in VariableManager get_vars() 28023 1726853647.56481: Calling all_inventory to load vars for managed_node3 28023 1726853647.56483: Calling groups_inventory to load vars for managed_node3 28023 1726853647.56485: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.56495: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.56497: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.56500: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.57679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.58547: done with get_vars() 28023 1726853647.58562: done getting variables 28023 1726853647.58607: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853647.58692: variable 'profile' from source: include params 28023 1726853647.58695: variable 'item' from source: include params 28023 1726853647.58737: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest1] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:34:07 -0400 (0:00:00.037) 0:00:39.671 ****** 28023 1726853647.58762: entering _queue_task() for managed_node3/command 28023 1726853647.59001: worker is 1 (out of 1 available) 28023 1726853647.59014: exiting _queue_task() for managed_node3/command 28023 1726853647.59026: done queuing things up, now waiting for results queue to drain 28023 1726853647.59027: 
waiting for pending results... 28023 1726853647.59218: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-ethtest1 28023 1726853647.59304: in run() - task 02083763-bbaf-fdb6-dad7-000000000b7f 28023 1726853647.59315: variable 'ansible_search_path' from source: unknown 28023 1726853647.59318: variable 'ansible_search_path' from source: unknown 28023 1726853647.59349: calling self._execute() 28023 1726853647.59433: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.59440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.59448: variable 'omit' from source: magic vars 28023 1726853647.59723: variable 'ansible_distribution_major_version' from source: facts 28023 1726853647.59733: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853647.59819: variable 'profile_stat' from source: set_fact 28023 1726853647.59827: Evaluated conditional (profile_stat.stat.exists): False 28023 1726853647.59831: when evaluation is False, skipping this task 28023 1726853647.59833: _execute() done 28023 1726853647.59836: dumping result to json 28023 1726853647.59840: done dumping result, returning 28023 1726853647.59846: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-ethtest1 [02083763-bbaf-fdb6-dad7-000000000b7f] 28023 1726853647.59851: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b7f 28023 1726853647.59940: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b7f 28023 1726853647.59943: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28023 1726853647.59994: no more pending results, returning what we have 28023 1726853647.59998: results queue empty 28023 1726853647.59999: checking for any_errors_fatal 28023 1726853647.60005: done checking for 
any_errors_fatal 28023 1726853647.60005: checking for max_fail_percentage 28023 1726853647.60007: done checking for max_fail_percentage 28023 1726853647.60008: checking to see if all hosts have failed and the running result is not ok 28023 1726853647.60009: done checking to see if all hosts have failed 28023 1726853647.60009: getting the remaining hosts for this loop 28023 1726853647.60011: done getting the remaining hosts for this loop 28023 1726853647.60014: getting the next task for host managed_node3 28023 1726853647.60023: done getting next task for host managed_node3 28023 1726853647.60026: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 28023 1726853647.60031: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853647.60034: getting variables 28023 1726853647.60035: in VariableManager get_vars() 28023 1726853647.60076: Calling all_inventory to load vars for managed_node3 28023 1726853647.60079: Calling groups_inventory to load vars for managed_node3 28023 1726853647.60081: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.60092: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.60095: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.60098: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.61001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.61863: done with get_vars() 28023 1726853647.61881: done getting variables 28023 1726853647.61926: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853647.62006: variable 'profile' from source: include params 28023 1726853647.62010: variable 'item' from source: include params 28023 1726853647.62051: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest1] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:34:07 -0400 (0:00:00.033) 0:00:39.704 ****** 28023 1726853647.62077: entering _queue_task() for managed_node3/set_fact 28023 1726853647.62331: worker is 1 (out of 1 available) 28023 1726853647.62347: exiting _queue_task() for managed_node3/set_fact 28023 1726853647.62359: done queuing things up, now waiting for results queue to drain 28023 1726853647.62360: waiting for pending results... 
28023 1726853647.62543: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 28023 1726853647.62628: in run() - task 02083763-bbaf-fdb6-dad7-000000000b80 28023 1726853647.62641: variable 'ansible_search_path' from source: unknown 28023 1726853647.62644: variable 'ansible_search_path' from source: unknown 28023 1726853647.62680: calling self._execute() 28023 1726853647.62765: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.62769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.62779: variable 'omit' from source: magic vars 28023 1726853647.63055: variable 'ansible_distribution_major_version' from source: facts 28023 1726853647.63067: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853647.63151: variable 'profile_stat' from source: set_fact 28023 1726853647.63161: Evaluated conditional (profile_stat.stat.exists): False 28023 1726853647.63164: when evaluation is False, skipping this task 28023 1726853647.63167: _execute() done 28023 1726853647.63174: dumping result to json 28023 1726853647.63176: done dumping result, returning 28023 1726853647.63182: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 [02083763-bbaf-fdb6-dad7-000000000b80] 28023 1726853647.63188: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b80 28023 1726853647.63273: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b80 28023 1726853647.63276: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28023 1726853647.63323: no more pending results, returning what we have 28023 1726853647.63326: results queue empty 28023 1726853647.63327: checking for any_errors_fatal 28023 1726853647.63334: done checking for any_errors_fatal 28023 1726853647.63334: 
checking for max_fail_percentage 28023 1726853647.63337: done checking for max_fail_percentage 28023 1726853647.63338: checking to see if all hosts have failed and the running result is not ok 28023 1726853647.63339: done checking to see if all hosts have failed 28023 1726853647.63339: getting the remaining hosts for this loop 28023 1726853647.63341: done getting the remaining hosts for this loop 28023 1726853647.63344: getting the next task for host managed_node3 28023 1726853647.63353: done getting next task for host managed_node3 28023 1726853647.63355: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 28023 1726853647.63362: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853647.63366: getting variables 28023 1726853647.63367: in VariableManager get_vars() 28023 1726853647.63409: Calling all_inventory to load vars for managed_node3 28023 1726853647.63412: Calling groups_inventory to load vars for managed_node3 28023 1726853647.63414: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.63424: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.63426: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.63429: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.64195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.65066: done with get_vars() 28023 1726853647.65083: done getting variables 28023 1726853647.65126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853647.65201: variable 'profile' from source: include params 28023 1726853647.65204: variable 'item' from source: include params 28023 1726853647.65243: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest1] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:34:07 -0400 (0:00:00.031) 0:00:39.736 ****** 28023 1726853647.65267: entering _queue_task() for managed_node3/command 28023 1726853647.65496: worker is 1 (out of 1 available) 28023 1726853647.65511: exiting _queue_task() for managed_node3/command 28023 1726853647.65525: done queuing things up, now waiting for results queue to drain 28023 1726853647.65526: waiting for pending results... 
28023 1726853647.65700: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-ethtest1 28023 1726853647.65797: in run() - task 02083763-bbaf-fdb6-dad7-000000000b81 28023 1726853647.65809: variable 'ansible_search_path' from source: unknown 28023 1726853647.65812: variable 'ansible_search_path' from source: unknown 28023 1726853647.65841: calling self._execute() 28023 1726853647.65923: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.65926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.65935: variable 'omit' from source: magic vars 28023 1726853647.66194: variable 'ansible_distribution_major_version' from source: facts 28023 1726853647.66205: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853647.66284: variable 'profile_stat' from source: set_fact 28023 1726853647.66294: Evaluated conditional (profile_stat.stat.exists): False 28023 1726853647.66298: when evaluation is False, skipping this task 28023 1726853647.66300: _execute() done 28023 1726853647.66302: dumping result to json 28023 1726853647.66305: done dumping result, returning 28023 1726853647.66315: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-ethtest1 [02083763-bbaf-fdb6-dad7-000000000b81] 28023 1726853647.66317: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b81 28023 1726853647.66399: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b81 28023 1726853647.66402: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28023 1726853647.66461: no more pending results, returning what we have 28023 1726853647.66464: results queue empty 28023 1726853647.66465: checking for any_errors_fatal 28023 1726853647.66475: done checking for any_errors_fatal 28023 1726853647.66475: checking for 
max_fail_percentage 28023 1726853647.66477: done checking for max_fail_percentage 28023 1726853647.66478: checking to see if all hosts have failed and the running result is not ok 28023 1726853647.66479: done checking to see if all hosts have failed 28023 1726853647.66480: getting the remaining hosts for this loop 28023 1726853647.66481: done getting the remaining hosts for this loop 28023 1726853647.66484: getting the next task for host managed_node3 28023 1726853647.66492: done getting next task for host managed_node3 28023 1726853647.66495: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 28023 1726853647.66500: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853647.66503: getting variables 28023 1726853647.66504: in VariableManager get_vars() 28023 1726853647.66538: Calling all_inventory to load vars for managed_node3 28023 1726853647.66540: Calling groups_inventory to load vars for managed_node3 28023 1726853647.66542: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.66553: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.66555: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.66558: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.67435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.68289: done with get_vars() 28023 1726853647.68302: done getting variables 28023 1726853647.68343: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853647.68419: variable 'profile' from source: include params 28023 1726853647.68422: variable 'item' from source: include params 28023 1726853647.68459: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest1] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:34:07 -0400 (0:00:00.032) 0:00:39.768 ****** 28023 1726853647.68484: entering _queue_task() for managed_node3/set_fact 28023 1726853647.68701: worker is 1 (out of 1 available) 28023 1726853647.68715: exiting _queue_task() for managed_node3/set_fact 28023 1726853647.68727: done queuing things up, now waiting for results queue to drain 28023 1726853647.68728: waiting for pending results... 
28023 1726853647.68906: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-ethtest1 28023 1726853647.68999: in run() - task 02083763-bbaf-fdb6-dad7-000000000b82 28023 1726853647.69009: variable 'ansible_search_path' from source: unknown 28023 1726853647.69013: variable 'ansible_search_path' from source: unknown 28023 1726853647.69041: calling self._execute() 28023 1726853647.69122: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.69126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.69133: variable 'omit' from source: magic vars 28023 1726853647.69400: variable 'ansible_distribution_major_version' from source: facts 28023 1726853647.69410: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853647.69492: variable 'profile_stat' from source: set_fact 28023 1726853647.69502: Evaluated conditional (profile_stat.stat.exists): False 28023 1726853647.69505: when evaluation is False, skipping this task 28023 1726853647.69508: _execute() done 28023 1726853647.69512: dumping result to json 28023 1726853647.69514: done dumping result, returning 28023 1726853647.69517: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-ethtest1 [02083763-bbaf-fdb6-dad7-000000000b82] 28023 1726853647.69522: sending task result for task 02083763-bbaf-fdb6-dad7-000000000b82 28023 1726853647.69616: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000b82 28023 1726853647.69619: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28023 1726853647.69667: no more pending results, returning what we have 28023 1726853647.69672: results queue empty 28023 1726853647.69673: checking for any_errors_fatal 28023 1726853647.69680: done checking for any_errors_fatal 28023 1726853647.69680: 
checking for max_fail_percentage 28023 1726853647.69682: done checking for max_fail_percentage 28023 1726853647.69683: checking to see if all hosts have failed and the running result is not ok 28023 1726853647.69684: done checking to see if all hosts have failed 28023 1726853647.69684: getting the remaining hosts for this loop 28023 1726853647.69686: done getting the remaining hosts for this loop 28023 1726853647.69689: getting the next task for host managed_node3 28023 1726853647.69697: done getting next task for host managed_node3 28023 1726853647.69700: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 28023 1726853647.69704: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853647.69708: getting variables 28023 1726853647.69714: in VariableManager get_vars() 28023 1726853647.69753: Calling all_inventory to load vars for managed_node3 28023 1726853647.69755: Calling groups_inventory to load vars for managed_node3 28023 1726853647.69757: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.69768: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.69772: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.69775: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.70523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.71399: done with get_vars() 28023 1726853647.71415: done getting variables 28023 1726853647.71459: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28023 1726853647.71540: variable 'profile' from source: include params 28023 1726853647.71543: variable 'item' from source: include params 28023 1726853647.71585: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest1'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 13:34:07 -0400 (0:00:00.031) 0:00:39.799 ****** 28023 1726853647.71607: entering _queue_task() for managed_node3/assert 28023 1726853647.71844: worker is 1 (out of 1 available) 28023 1726853647.71858: exiting _queue_task() for managed_node3/assert 28023 1726853647.71870: done queuing things up, now waiting for results queue to drain 28023 1726853647.71873: waiting for pending results... 
28023 1726853647.72050: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'ethtest1' 28023 1726853647.72130: in run() - task 02083763-bbaf-fdb6-dad7-000000000a72 28023 1726853647.72140: variable 'ansible_search_path' from source: unknown 28023 1726853647.72144: variable 'ansible_search_path' from source: unknown 28023 1726853647.72176: calling self._execute() 28023 1726853647.72265: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.72268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.72279: variable 'omit' from source: magic vars 28023 1726853647.72569: variable 'ansible_distribution_major_version' from source: facts 28023 1726853647.72581: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853647.72586: variable 'omit' from source: magic vars 28023 1726853647.72615: variable 'omit' from source: magic vars 28023 1726853647.72688: variable 'profile' from source: include params 28023 1726853647.72692: variable 'item' from source: include params 28023 1726853647.72735: variable 'item' from source: include params 28023 1726853647.72753: variable 'omit' from source: magic vars 28023 1726853647.72789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853647.72816: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853647.72832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853647.72845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853647.72856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853647.72884: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 28023 1726853647.72888: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.72890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.72956: Set connection var ansible_shell_type to sh 28023 1726853647.72966: Set connection var ansible_shell_executable to /bin/sh 28023 1726853647.72977: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853647.72981: Set connection var ansible_connection to ssh 28023 1726853647.72983: Set connection var ansible_pipelining to False 28023 1726853647.72989: Set connection var ansible_timeout to 10 28023 1726853647.73009: variable 'ansible_shell_executable' from source: unknown 28023 1726853647.73012: variable 'ansible_connection' from source: unknown 28023 1726853647.73014: variable 'ansible_module_compression' from source: unknown 28023 1726853647.73017: variable 'ansible_shell_type' from source: unknown 28023 1726853647.73019: variable 'ansible_shell_executable' from source: unknown 28023 1726853647.73021: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.73024: variable 'ansible_pipelining' from source: unknown 28023 1726853647.73027: variable 'ansible_timeout' from source: unknown 28023 1726853647.73031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.73134: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853647.73144: variable 'omit' from source: magic vars 28023 1726853647.73149: starting attempt loop 28023 1726853647.73152: running the handler 28023 1726853647.73238: variable 'lsr_net_profile_exists' from source: set_fact 28023 1726853647.73241: Evaluated conditional (not 
lsr_net_profile_exists): True 28023 1726853647.73247: handler run complete 28023 1726853647.73258: attempt loop complete, returning result 28023 1726853647.73263: _execute() done 28023 1726853647.73265: dumping result to json 28023 1726853647.73268: done dumping result, returning 28023 1726853647.73277: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'ethtest1' [02083763-bbaf-fdb6-dad7-000000000a72] 28023 1726853647.73281: sending task result for task 02083763-bbaf-fdb6-dad7-000000000a72 28023 1726853647.73360: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000a72 28023 1726853647.73363: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 28023 1726853647.73447: no more pending results, returning what we have 28023 1726853647.73450: results queue empty 28023 1726853647.73451: checking for any_errors_fatal 28023 1726853647.73459: done checking for any_errors_fatal 28023 1726853647.73460: checking for max_fail_percentage 28023 1726853647.73462: done checking for max_fail_percentage 28023 1726853647.73463: checking to see if all hosts have failed and the running result is not ok 28023 1726853647.73464: done checking to see if all hosts have failed 28023 1726853647.73464: getting the remaining hosts for this loop 28023 1726853647.73466: done getting the remaining hosts for this loop 28023 1726853647.73469: getting the next task for host managed_node3 28023 1726853647.73479: done getting next task for host managed_node3 28023 1726853647.73483: ^ task is: TASK: Verify network state restored to default 28023 1726853647.73485: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853647.73489: getting variables 28023 1726853647.73491: in VariableManager get_vars() 28023 1726853647.73527: Calling all_inventory to load vars for managed_node3 28023 1726853647.73530: Calling groups_inventory to load vars for managed_node3 28023 1726853647.73532: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.73542: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.73544: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.73547: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.74917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.75860: done with get_vars() 28023 1726853647.75879: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:169 Friday 20 September 2024 13:34:07 -0400 (0:00:00.043) 0:00:39.843 ****** 28023 1726853647.75991: entering _queue_task() for managed_node3/include_tasks 28023 1726853647.76497: worker is 1 (out of 1 available) 28023 1726853647.76506: exiting _queue_task() for managed_node3/include_tasks 28023 1726853647.76516: done queuing things up, now waiting for results queue to drain 28023 1726853647.76518: waiting for pending results... 
28023 1726853647.76647: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 28023 1726853647.76822: in run() - task 02083763-bbaf-fdb6-dad7-0000000000bb 28023 1726853647.76826: variable 'ansible_search_path' from source: unknown 28023 1726853647.76858: calling self._execute() 28023 1726853647.76988: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.77029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.77033: variable 'omit' from source: magic vars 28023 1726853647.77439: variable 'ansible_distribution_major_version' from source: facts 28023 1726853647.77466: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853647.77504: _execute() done 28023 1726853647.77507: dumping result to json 28023 1726853647.77510: done dumping result, returning 28023 1726853647.77512: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [02083763-bbaf-fdb6-dad7-0000000000bb] 28023 1726853647.77514: sending task result for task 02083763-bbaf-fdb6-dad7-0000000000bb 28023 1726853647.77744: no more pending results, returning what we have 28023 1726853647.77749: in VariableManager get_vars() 28023 1726853647.77802: Calling all_inventory to load vars for managed_node3 28023 1726853647.77805: Calling groups_inventory to load vars for managed_node3 28023 1726853647.77808: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.77937: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.77942: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.77946: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.78556: done sending task result for task 02083763-bbaf-fdb6-dad7-0000000000bb 28023 1726853647.78559: WORKER PROCESS EXITING 28023 1726853647.79357: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.81128: done with get_vars() 28023 1726853647.81159: variable 'ansible_search_path' from source: unknown 28023 1726853647.81193: we have included files to process 28023 1726853647.81194: generating all_blocks data 28023 1726853647.81197: done generating all_blocks data 28023 1726853647.81202: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28023 1726853647.81203: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28023 1726853647.81206: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28023 1726853647.81622: done processing included file 28023 1726853647.81624: iterating over new_blocks loaded from include file 28023 1726853647.81625: in VariableManager get_vars() 28023 1726853647.81644: done with get_vars() 28023 1726853647.81646: filtering new block on tags 28023 1726853647.81681: done filtering new block on tags 28023 1726853647.81684: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 28023 1726853647.81690: extending task lists for all hosts with included blocks 28023 1726853647.83621: done extending task lists 28023 1726853647.83623: done processing included files 28023 1726853647.83623: results queue empty 28023 1726853647.83624: checking for any_errors_fatal 28023 1726853647.83627: done checking for any_errors_fatal 28023 1726853647.83628: checking for max_fail_percentage 28023 1726853647.83629: done checking for max_fail_percentage 28023 1726853647.83630: checking to see if all hosts have failed and the running 
result is not ok 28023 1726853647.83631: done checking to see if all hosts have failed 28023 1726853647.83632: getting the remaining hosts for this loop 28023 1726853647.83633: done getting the remaining hosts for this loop 28023 1726853647.83635: getting the next task for host managed_node3 28023 1726853647.83639: done getting next task for host managed_node3 28023 1726853647.83641: ^ task is: TASK: Check routes and DNS 28023 1726853647.83644: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28023 1726853647.83653: getting variables 28023 1726853647.83655: in VariableManager get_vars() 28023 1726853647.83685: Calling all_inventory to load vars for managed_node3 28023 1726853647.83688: Calling groups_inventory to load vars for managed_node3 28023 1726853647.83690: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853647.83696: Calling all_plugins_play to load vars for managed_node3 28023 1726853647.83698: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853647.83701: Calling groups_plugins_play to load vars for managed_node3 28023 1726853647.85103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853647.88278: done with get_vars() 28023 1726853647.88308: done getting variables 28023 1726853647.88355: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:34:07 -0400 (0:00:00.125) 0:00:39.969 ****** 28023 1726853647.88592: entering _queue_task() for managed_node3/shell 28023 1726853647.89160: worker is 1 (out of 1 available) 28023 1726853647.89574: exiting _queue_task() for managed_node3/shell 28023 1726853647.89585: done queuing things up, now waiting for results queue to drain 28023 1726853647.89587: waiting for pending results... 
28023 1726853647.89673: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 28023 1726853647.89983: in run() - task 02083763-bbaf-fdb6-dad7-000000000bb6 28023 1726853647.90007: variable 'ansible_search_path' from source: unknown 28023 1726853647.90015: variable 'ansible_search_path' from source: unknown 28023 1726853647.90250: calling self._execute() 28023 1726853647.90478: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.90482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.90486: variable 'omit' from source: magic vars 28023 1726853647.90979: variable 'ansible_distribution_major_version' from source: facts 28023 1726853647.91002: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853647.91008: variable 'omit' from source: magic vars 28023 1726853647.91058: variable 'omit' from source: magic vars 28023 1726853647.91096: variable 'omit' from source: magic vars 28023 1726853647.91135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853647.91182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853647.91202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853647.91220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853647.91234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853647.91270: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853647.91275: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.91278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.91473: 
Set connection var ansible_shell_type to sh 28023 1726853647.91477: Set connection var ansible_shell_executable to /bin/sh 28023 1726853647.91480: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853647.91482: Set connection var ansible_connection to ssh 28023 1726853647.91484: Set connection var ansible_pipelining to False 28023 1726853647.91486: Set connection var ansible_timeout to 10 28023 1726853647.91488: variable 'ansible_shell_executable' from source: unknown 28023 1726853647.91490: variable 'ansible_connection' from source: unknown 28023 1726853647.91493: variable 'ansible_module_compression' from source: unknown 28023 1726853647.91495: variable 'ansible_shell_type' from source: unknown 28023 1726853647.91498: variable 'ansible_shell_executable' from source: unknown 28023 1726853647.91499: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853647.91502: variable 'ansible_pipelining' from source: unknown 28023 1726853647.91504: variable 'ansible_timeout' from source: unknown 28023 1726853647.91506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853647.91589: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853647.91599: variable 'omit' from source: magic vars 28023 1726853647.91605: starting attempt loop 28023 1726853647.91608: running the handler 28023 1726853647.91617: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853647.91634: 
_low_level_execute_command(): starting 28023 1726853647.91641: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853647.92458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853647.92478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853647.92493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853647.92513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853647.92572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853647.92640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853647.92665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853647.92786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853647.92972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853647.94682: stdout chunk (state=3): >>>/root <<< 28023 1726853647.94775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853647.94812: stderr chunk (state=3): >>><<< 28023 1726853647.94814: 
stdout chunk (state=3): >>><<< 28023 1726853647.94831: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853647.94878: _low_level_execute_command(): starting 28023 1726853647.94881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427 `" && echo ansible-tmp-1726853647.9483776-29796-58747463244427="` echo /root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427 `" ) && sleep 0' 28023 1726853647.95255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853647.95270: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853647.95295: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853647.95298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853647.95345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853647.95349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853647.95419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853647.97353: stdout chunk (state=3): >>>ansible-tmp-1726853647.9483776-29796-58747463244427=/root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427 <<< 28023 1726853647.97461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853647.97491: stderr chunk (state=3): >>><<< 28023 1726853647.97493: stdout chunk (state=3): >>><<< 28023 1726853647.97505: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853647.9483776-29796-58747463244427=/root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853647.97577: variable 'ansible_module_compression' from source: unknown 28023 1726853647.97581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853647.97607: variable 'ansible_facts' from source: unknown 28023 1726853647.97659: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427/AnsiballZ_command.py 28023 1726853647.97763: Sending initial data 28023 1726853647.97767: Sent initial data (155 bytes) 28023 1726853647.98153: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853647.98160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853647.98192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853647.98195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853647.98198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853647.98200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853647.98262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853647.98289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853647.98360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853647.99991: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 
<<< 28023 1726853648.00035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853648.00125: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpi93v31un /root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427/AnsiballZ_command.py <<< 28023 1726853648.00128: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427/AnsiballZ_command.py" <<< 28023 1726853648.00174: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpi93v31un" to remote "/root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427/AnsiballZ_command.py" <<< 28023 1726853648.00896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853648.00934: stderr chunk (state=3): >>><<< 28023 1726853648.00978: stdout chunk (state=3): >>><<< 28023 1726853648.00981: done transferring module to remote 28023 1726853648.00983: _low_level_execute_command(): starting 28023 1726853648.00986: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427/ /root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427/AnsiballZ_command.py && sleep 0' 28023 1726853648.01481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853648.01504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853648.01589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853648.03502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853648.03510: stdout chunk (state=3): >>><<< 28023 1726853648.03518: stderr chunk (state=3): >>><<< 28023 1726853648.03539: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853648.03545: _low_level_execute_command(): starting 28023 1726853648.03551: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427/AnsiballZ_command.py && sleep 0' 28023 1726853648.04177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853648.04210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853648.04215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853648.04230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853648.04312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853648.20669: 
stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2832sec preferred_lft 2832sec\n inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 6e:7c:f1:8e:1c:81 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:34:08.196292", "end": "2024-09-20 13:34:08.205331", "delta": "0:00:00.009039", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO 
/etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853648.22445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 28023 1726853648.22449: stdout chunk (state=3): >>><<< 28023 1726853648.22452: stderr chunk (state=3): >>><<< 28023 1726853648.22476: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2832sec preferred_lft 2832sec\n inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 6e:7c:f1:8e:1c:81 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo 
pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:34:08.196292", "end": "2024-09-20 13:34:08.205331", "delta": "0:00:00.009039", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.11.217 closed. 28023 1726853648.22682: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853648.22686: _low_level_execute_command(): starting 28023 1726853648.22689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853647.9483776-29796-58747463244427/ > /dev/null 2>&1 && sleep 0' 28023 1726853648.23920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853648.23944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853648.23961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853648.24158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853648.24295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853648.24391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853648.26306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853648.26354: stderr chunk (state=3): >>><<< 28023 1726853648.26360: stdout chunk (state=3): >>><<< 28023 1726853648.26363: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853648.26366: handler run complete 28023 1726853648.26466: Evaluated conditional (False): False 28023 1726853648.26469: attempt loop complete, returning result 28023 1726853648.26473: _execute() done 28023 1726853648.26475: dumping result to json 28023 1726853648.26477: done dumping result, returning 28023 1726853648.26479: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [02083763-bbaf-fdb6-dad7-000000000bb6] 28023 1726853648.26481: sending task result for task 02083763-bbaf-fdb6-dad7-000000000bb6 28023 1726853648.26548: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000bb6 28023 1726853648.26551: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009039", "end": "2024-09-20 13:34:08.205331", "rc": 0, "start": "2024-09-20 13:34:08.196292" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2832sec preferred_lft 2832sec inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute valid_lft forever preferred_lft forever 30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group 
default qlen 1000 link/ether 6e:7c:f1:8e:1c:81 brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 28023 1726853648.26633: no more pending results, returning what we have 28023 1726853648.26639: results queue empty 28023 1726853648.26639: checking for any_errors_fatal 28023 1726853648.26641: done checking for any_errors_fatal 28023 1726853648.26642: checking for max_fail_percentage 28023 1726853648.26643: done checking for max_fail_percentage 28023 1726853648.26644: checking to see if all hosts have failed and the running result is not ok 28023 1726853648.26645: done checking to see if all hosts have failed 28023 1726853648.26646: getting the remaining hosts for this loop 28023 1726853648.26648: done getting the remaining hosts for this loop 28023 1726853648.26651: getting the next task for host managed_node3 28023 1726853648.26661: done getting next task for host managed_node3 28023 1726853648.26664: ^ task is: TASK: Verify DNS and network connectivity 28023 1726853648.26667: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28023 1726853648.26681: getting variables 28023 1726853648.26683: in VariableManager get_vars() 28023 1726853648.26722: Calling all_inventory to load vars for managed_node3 28023 1726853648.26724: Calling groups_inventory to load vars for managed_node3 28023 1726853648.26726: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853648.26738: Calling all_plugins_play to load vars for managed_node3 28023 1726853648.26740: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853648.26743: Calling groups_plugins_play to load vars for managed_node3 28023 1726853648.28237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853648.29838: done with get_vars() 28023 1726853648.29864: done getting variables 28023 1726853648.29925: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:34:08 -0400 (0:00:00.413) 0:00:40.383 ****** 28023 1726853648.29956: entering _queue_task() for managed_node3/shell 28023 1726853648.30300: worker is 1 (out of 1 available) 28023 
1726853648.30312: exiting _queue_task() for managed_node3/shell 28023 1726853648.30324: done queuing things up, now waiting for results queue to drain 28023 1726853648.30325: waiting for pending results... 28023 1726853648.30697: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 28023 1726853648.30716: in run() - task 02083763-bbaf-fdb6-dad7-000000000bb7 28023 1726853648.30730: variable 'ansible_search_path' from source: unknown 28023 1726853648.30733: variable 'ansible_search_path' from source: unknown 28023 1726853648.30768: calling self._execute() 28023 1726853648.30883: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853648.30888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853648.30977: variable 'omit' from source: magic vars 28023 1726853648.31267: variable 'ansible_distribution_major_version' from source: facts 28023 1726853648.31281: Evaluated conditional (ansible_distribution_major_version != '6'): True 28023 1726853648.31419: variable 'ansible_facts' from source: unknown 28023 1726853648.32168: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 28023 1726853648.32176: variable 'omit' from source: magic vars 28023 1726853648.32233: variable 'omit' from source: magic vars 28023 1726853648.32378: variable 'omit' from source: magic vars 28023 1726853648.32382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28023 1726853648.32385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28023 1726853648.32387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28023 1726853648.32390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853648.32403: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28023 1726853648.32437: variable 'inventory_hostname' from source: host vars for 'managed_node3' 28023 1726853648.32440: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853648.32443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853648.32553: Set connection var ansible_shell_type to sh 28023 1726853648.32566: Set connection var ansible_shell_executable to /bin/sh 28023 1726853648.32569: Set connection var ansible_module_compression to ZIP_DEFLATED 28023 1726853648.32574: Set connection var ansible_connection to ssh 28023 1726853648.32580: Set connection var ansible_pipelining to False 28023 1726853648.32586: Set connection var ansible_timeout to 10 28023 1726853648.32615: variable 'ansible_shell_executable' from source: unknown 28023 1726853648.32618: variable 'ansible_connection' from source: unknown 28023 1726853648.32621: variable 'ansible_module_compression' from source: unknown 28023 1726853648.32624: variable 'ansible_shell_type' from source: unknown 28023 1726853648.32626: variable 'ansible_shell_executable' from source: unknown 28023 1726853648.32628: variable 'ansible_host' from source: host vars for 'managed_node3' 28023 1726853648.32676: variable 'ansible_pipelining' from source: unknown 28023 1726853648.32679: variable 'ansible_timeout' from source: unknown 28023 1726853648.32682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 28023 1726853648.32791: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853648.32801: variable 'omit' from source: magic vars 28023 1726853648.32810: starting attempt 
loop 28023 1726853648.32814: running the handler 28023 1726853648.32833: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28023 1726853648.32837: _low_level_execute_command(): starting 28023 1726853648.32846: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28023 1726853648.33593: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853648.33679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853648.33701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853648.33711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853648.33733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853648.33828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
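The `_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'` record above is the probe Ansible uses to discover the remote user's home directory before it creates its per-task temp directory (the `umask 77 && mkdir -p ...` step that follows in this log). A minimal local sketch of that two-step bootstrap, using plain `subprocess` rather than Ansible's actual ssh connection plugin; the temp-directory name below is illustrative, not Ansible's real timestamped scheme:

```python
import os
import subprocess


def low_level_execute(cmd: str) -> tuple[int, str, str]:
    """Run a command through /bin/sh -c, the way the connection plugin
    wraps its bootstrap commands, and return (rc, stdout, stderr)."""
    proc = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr


# Step 1: discover the home directory (the "echo ~ && sleep 0" probe).
rc, out, _ = low_level_execute("echo ~ && sleep 0")
home = out.strip()

# Step 2: create a private temp directory under it, mirroring the log's
# "( umask 77 && mkdir -p ... )" command. Name is a stand-in.
tmp = f"{home}/.ansible/tmp/example-tmp"
rc2, _, _ = low_level_execute(f'( umask 77 && mkdir -p "{tmp}" ) && sleep 0')
```

Both commands end in `&& sleep 0` exactly as in the log; Ansible appends that so the shell's exit status reflects the whole pipeline rather than a backgrounded tail.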
28023 1726853648.35527: stdout chunk (state=3): >>>/root <<< 28023 1726853648.35653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853648.35660: stdout chunk (state=3): >>><<< 28023 1726853648.35666: stderr chunk (state=3): >>><<< 28023 1726853648.35690: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853648.35702: _low_level_execute_command(): starting 28023 1726853648.35709: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330 `" && echo ansible-tmp-1726853648.3569138-29825-80452841408330="` echo /root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330 `" ) && sleep 0' 28023 1726853648.36132: stderr 
chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853648.36144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28023 1726853648.36168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 28023 1726853648.36176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853648.36221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853648.36225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853648.36294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853648.38245: stdout chunk (state=3): >>>ansible-tmp-1726853648.3569138-29825-80452841408330=/root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330 <<< 28023 1726853648.38343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853648.38370: stderr chunk (state=3): >>><<< 28023 1726853648.38378: stdout chunk (state=3): >>><<< 28023 1726853648.38394: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853648.3569138-29825-80452841408330=/root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853648.38424: variable 'ansible_module_compression' from source: unknown 28023 1726853648.38465: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28023xdlkztex/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28023 1726853648.38497: variable 'ansible_facts' from source: unknown 28023 1726853648.38556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330/AnsiballZ_command.py 28023 1726853648.38660: Sending initial data 28023 1726853648.38664: Sent initial data (155 bytes) 28023 1726853648.39112: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28023 1726853648.39115: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853648.39118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853648.39120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853648.39122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853648.39124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853648.39126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 28023 1726853648.39128: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853648.39177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853648.39181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853648.39183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853648.39246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853648.40851: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" 
revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28023 1726853648.40900: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28023 1726853648.40953: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28023xdlkztex/tmpdkumnluc /root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330/AnsiballZ_command.py <<< 28023 1726853648.40965: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330/AnsiballZ_command.py" <<< 28023 1726853648.41009: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-28023xdlkztex/tmpdkumnluc" to remote "/root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330/AnsiballZ_command.py" <<< 28023 1726853648.41626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853648.41663: stderr chunk (state=3): >>><<< 28023 1726853648.41667: stdout chunk (state=3): >>><<< 28023 1726853648.41696: done transferring module to remote 28023 1726853648.41705: _low_level_execute_command(): starting 28023 1726853648.41710: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330/ /root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330/AnsiballZ_command.py && sleep 0' 28023 
1726853648.42364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853648.42386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853648.42404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853648.42496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853648.44344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853648.44374: stderr chunk (state=3): >>><<< 28023 1726853648.44377: stdout chunk (state=3): >>><<< 28023 1726853648.44390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853648.44393: _low_level_execute_command(): starting 28023 1726853648.44399: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330/AnsiballZ_command.py && sleep 0' 28023 1726853648.44834: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853648.44838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853648.44840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853648.44843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853648.44845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853648.44898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 28023 1726853648.44905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28023 1726853648.44907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28023 1726853648.44967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853648.88090: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1444 0 --:--:-- --:--:-- --:--:-- 1445\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7860 0 --:--:-- --:--:-- --:--:-- 8083", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:34:08.604227", "end": "2024-09-20 13:34:08.876559", "delta": "0:00:00.272332", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28023 1726853648.89735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 28023 1726853648.89858: stderr chunk (state=3): >>><<< 28023 1726853648.89862: stdout chunk (state=3): >>><<< 28023 1726853648.89891: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1444 0 --:--:-- --:--:-- --:--:-- 1445\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7860 0 --:--:-- --:--:-- --:--:-- 8083", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:34:08.604227", "end": "2024-09-20 13:34:08.876559", "delta": "0:00:00.272332", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 28023 1726853648.89934: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28023 1726853648.89941: _low_level_execute_command(): starting 28023 1726853648.89946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853648.3569138-29825-80452841408330/ > /dev/null 2>&1 && sleep 0' 28023 1726853648.91076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28023 1726853648.91080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 28023 1726853648.91082: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853648.91084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28023 1726853648.91149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 28023 1726853648.91152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28023 1726853648.91390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28023 1726853648.93334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28023 1726853648.93338: stderr chunk (state=3): >>><<< 28023 1726853648.93340: stdout chunk (state=3): >>><<< 28023 1726853648.93492: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28023 1726853648.93498: handler run complete 28023 1726853648.93577: Evaluated conditional (False): False 28023 1726853648.93581: attempt loop complete, returning result 28023 1726853648.93583: _execute() done 28023 1726853648.93585: dumping result to json 28023 1726853648.93587: done dumping result, returning 28023 1726853648.93589: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [02083763-bbaf-fdb6-dad7-000000000bb7] 28023 1726853648.93590: sending task result for task 02083763-bbaf-fdb6-dad7-000000000bb7 ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.272332", "end": "2024-09-20 13:34:08.876559", "rc": 0, "start": "2024-09-20 13:34:08.604227" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1444 0 --:--:-- --:--:-- --:--:-- 1445 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 7860 0 --:--:-- --:--:-- --:--:-- 8083 28023 1726853648.93839: no more pending results, returning what we have 28023 1726853648.93842: results queue empty 28023 1726853648.93843: 
checking for any_errors_fatal 28023 1726853648.93855: done checking for any_errors_fatal 28023 1726853648.93856: checking for max_fail_percentage 28023 1726853648.93857: done checking for max_fail_percentage 28023 1726853648.93858: checking to see if all hosts have failed and the running result is not ok 28023 1726853648.93859: done checking to see if all hosts have failed 28023 1726853648.93860: getting the remaining hosts for this loop 28023 1726853648.93862: done getting the remaining hosts for this loop 28023 1726853648.93865: getting the next task for host managed_node3 28023 1726853648.93879: done getting next task for host managed_node3 28023 1726853648.93881: ^ task is: TASK: meta (flush_handlers) 28023 1726853648.93887: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853648.93893: getting variables 28023 1726853648.93895: in VariableManager get_vars() 28023 1726853648.93941: Calling all_inventory to load vars for managed_node3 28023 1726853648.93944: Calling groups_inventory to load vars for managed_node3 28023 1726853648.93946: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853648.93959: Calling all_plugins_play to load vars for managed_node3 28023 1726853648.93963: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853648.93966: Calling groups_plugins_play to load vars for managed_node3 28023 1726853648.94879: done sending task result for task 02083763-bbaf-fdb6-dad7-000000000bb7 28023 1726853648.94883: WORKER PROCESS EXITING 28023 1726853648.96602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853648.99744: done with get_vars() 28023 1726853648.99775: done getting variables 28023 1726853648.99837: in VariableManager get_vars() 28023 1726853648.99852: Calling all_inventory to load vars for managed_node3 28023 1726853648.99854: Calling groups_inventory to load vars for managed_node3 28023 1726853648.99856: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853648.99861: Calling all_plugins_play to load vars for managed_node3 28023 1726853648.99863: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853648.99865: Calling groups_plugins_play to load vars for managed_node3 28023 1726853649.01765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853649.03400: done with get_vars() 28023 1726853649.03437: done queuing things up, now waiting for results queue to drain 28023 1726853649.03440: results queue empty 28023 1726853649.03441: checking for any_errors_fatal 28023 1726853649.03445: done checking for any_errors_fatal 28023 1726853649.03446: checking for max_fail_percentage 28023 
1726853649.03447: done checking for max_fail_percentage 28023 1726853649.03447: checking to see if all hosts have failed and the running result is not ok 28023 1726853649.03448: done checking to see if all hosts have failed 28023 1726853649.03449: getting the remaining hosts for this loop 28023 1726853649.03450: done getting the remaining hosts for this loop 28023 1726853649.03453: getting the next task for host managed_node3 28023 1726853649.03456: done getting next task for host managed_node3 28023 1726853649.03458: ^ task is: TASK: meta (flush_handlers) 28023 1726853649.03459: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28023 1726853649.03462: getting variables 28023 1726853649.03463: in VariableManager get_vars() 28023 1726853649.03478: Calling all_inventory to load vars for managed_node3 28023 1726853649.03481: Calling groups_inventory to load vars for managed_node3 28023 1726853649.03483: Calling all_plugins_inventory to load vars for managed_node3 28023 1726853649.03488: Calling all_plugins_play to load vars for managed_node3 28023 1726853649.03490: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853649.03493: Calling groups_plugins_play to load vars for managed_node3 28023 1726853649.04728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853649.06595: done with get_vars() 28023 1726853649.06616: done getting variables 28023 1726853649.06669: in VariableManager get_vars() 28023 1726853649.06691: Calling all_inventory to load vars for managed_node3 28023 1726853649.06694: Calling groups_inventory to load vars for managed_node3 28023 1726853649.06696: Calling all_plugins_inventory to load vars for managed_node3 28023 
1726853649.06701: Calling all_plugins_play to load vars for managed_node3 28023 1726853649.06703: Calling groups_plugins_inventory to load vars for managed_node3 28023 1726853649.06706: Calling groups_plugins_play to load vars for managed_node3 28023 1726853649.09278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28023 1726853649.11511: done with get_vars() 28023 1726853649.11544: done queuing things up, now waiting for results queue to drain 28023 1726853649.11546: results queue empty 28023 1726853649.11547: checking for any_errors_fatal 28023 1726853649.11548: done checking for any_errors_fatal 28023 1726853649.11549: checking for max_fail_percentage 28023 1726853649.11550: done checking for max_fail_percentage 28023 1726853649.11551: checking to see if all hosts have failed and the running result is not ok 28023 1726853649.11552: done checking to see if all hosts have failed 28023 1726853649.11553: getting the remaining hosts for this loop 28023 1726853649.11553: done getting the remaining hosts for this loop 28023 1726853649.11556: getting the next task for host managed_node3 28023 1726853649.11562: done getting next task for host managed_node3 28023 1726853649.11563: ^ task is: None 28023 1726853649.11565: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28023 1726853649.11566: done queuing things up, now waiting for results queue to drain 28023 1726853649.11567: results queue empty 28023 1726853649.11567: checking for any_errors_fatal 28023 1726853649.11568: done checking for any_errors_fatal 28023 1726853649.11569: checking for max_fail_percentage 28023 1726853649.11570: done checking for max_fail_percentage 28023 1726853649.11575: checking to see if all hosts have failed and the running result is not ok 28023 1726853649.11576: done checking to see if all hosts have failed 28023 1726853649.11578: getting the next task for host managed_node3 28023 1726853649.11581: done getting next task for host managed_node3 28023 1726853649.11581: ^ task is: None 28023 1726853649.11582: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed_node3 : ok=108 changed=3 unreachable=0 failed=0 skipped=87 rescued=0 ignored=2 Friday 20 September 2024 13:34:09 -0400 (0:00:00.817) 0:00:41.201 ****** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 2.02s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.86s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.81s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Gathering Facts --------------------------------------------------------- 1.72s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.63s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Check which packages are installed --- 1.15s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Create veth interface ethtest0 ------------------------------------------ 1.13s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Gathering Facts --------------------------------------------------------- 1.11s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3 Create veth interface ethtest1 ------------------------------------------ 1.08s 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.90s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Verify DNS and network connectivity ------------------------------------- 0.82s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Check if system is ostree ----------------------------------------------- 0.79s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 fedora.linux_system_roles.network : Configure networking connection profiles --- 0.79s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Install iproute --------------------------------------------------------- 0.77s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 fedora.linux_system_roles.network : Check which packages are installed --- 0.75s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 0.74s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gather the minimum subset of ansible_facts required by the network role test --- 0.71s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.69s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Install iproute --------------------------------------------------------- 0.66s 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 fedora.linux_system_roles.network : Configure networking connection profiles --- 0.65s /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 28023 1726853649.11781: RUNNING CLEANUP